                Interactively Displaying Maps
                on a Tactile Graphics Display

                       Bernhard Schmitz and Thomas Ertl

      Institute for Visualization and Interactive Systems, Universität Stuttgart
             {Bernhard.Schmitz, Thomas.Ertl}@vis.uni-stuttgart.de



      Abstract. We present a system that allows blind users to explore both
      indoor maps and large-scale OpenStreetMap data on a tactile graph-
      ics display. Additional information about the map object that is being
      touched by the user’s finger is provided by text-to-speech output. Dif-
ferent styles allow the user to concentrate on specific aspects of the map. First
      tests show that the system can help blind users to get an impression of
      the layout of unknown areas, and even to get a better understanding of
      areas that are well-known to them.

      Keywords: Tactile Maps, Tactile Graphics Display, Accessibility


1   Introduction

Being able to read maps is an important first step towards successful navigation
in unknown areas. While maps are becoming ever more available for sighted
persons via smartphones and services such as Google Maps, blind users often
still have to rely on maps that are provided specifically for them, and are often
only available for small areas. At the same time, digital maps are becoming
more widely available through initiatives such as OpenStreetMap. What is still
missing is the link between those digital maps and blind users. Screen readers
and Braille devices have taken on the role of this link for digital information in
text form, but for spatial information such a link is still missing. In
this paper we present a system that is designed to provide this link, and to make
maps – especially OpenStreetMap – accessible to blind users.


2   Related Work

With the increasingly widespread availability of digital maps, making them
accessible to all is a logical next step. Accessible maps can be classified according to
several criteria, including the sensory channel that they use (e.g. tactile vs. au-
ditory maps). Another important distinction is whether they represent a larger
two-dimensional overview of the map or an approach based on a virtual ob-
server, where only information in the immediate vicinity of a freely movable
virtual observer (or cursor) is presented to the user.





Fig. 1. The tactile graphics display, including a braille keyboard on top and two four-
way digital crosses.


    Purely auditory maps are normally bound to the virtual observer model,
such as the auditory torch by Heuten et al. [2, 1]. In contrast to that, the works
on tactile or combined tactile and auditory maps vary between those using a
virtual observer model and those representing a larger overview at once. Rice
et al. combine tactile and auditory feedback in a system where the virtual
observer is controlled with a force feedback mouse [5]. Our own previous work includes
a virtual observer controlled by a rumble gamepad [7].
    Systems that present a larger overview often require more elaborate hardware
and a setup that is tied to desktop use. Wang et al. print out a certain area of
the map with a thermal embosser [8]. The printout is then placed on a touchpad,
allowing audio feedback upon the touch of the user’s finger. A similar approach
was used by Paladugu et al. with the goal of evaluating design patterns for the
production of tactile maps [4]. The system by Zeng and Weber is most similar
to the one presented in this paper, also using a tactile graphics display [9]. The
system uses a built-in GIS and renders roads as lines, and buildings and other
points of interest as fixed symbols from a library. Our system aims at greater
flexibility by using OpenStreetMap and freely configurable styles.


3    The Tactile Graphics Display

Our system displays the maps on a Tactile Graphics Display, the “Stuttgarter
Stiftplatte”, with a resolution of 120x60 pins and touch-sensitive sensors for
feedback about the position of the user’s fingers [6] (see Figure 1). However, our
system does not specifically build maps for this display, e.g. by directly activating
specific pins. Instead, normal graphics output is used, and the driver converts
the output for use on the Tactile Graphics Display. With this approach, the
system is not limited to a specific Tactile Graphics Display; any display whose
driver can convert on-screen graphics can be used.
    Because of this approach some graphical details might be lost during the
conversion. This can mostly be avoided by choosing appropriate styles for the
graphics display (section 6.1).
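
As an illustration of this conversion step, the following minimal sketch downscales an on-screen rendering to the 120x60 pin grid and binarizes it. The library choice (Pillow), the function name and the threshold are our own assumptions for illustration, not the actual driver logic.

```python
# Hedged sketch: one plausible way a driver could map normal graphics output
# onto a 120x60 pin matrix. The real conversion performed by the display
# driver is not described here; the threshold value is an assumption.
from PIL import Image

PIN_COLUMNS, PIN_ROWS = 120, 60  # resolution of the tactile graphics display

def rasterize_for_pins(screenshot: Image.Image, threshold: int = 128):
    """Downscale a rendered map image and binarize it into raised/lowered pins."""
    gray = screenshot.convert("L").resize((PIN_COLUMNS, PIN_ROWS))
    pixels = gray.load()
    # A pin is raised wherever the downscaled pixel is darker than the threshold.
    return [[pixels[x, y] < threshold for x in range(PIN_COLUMNS)]
            for y in range(PIN_ROWS)]
```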



4     Maps
Our system displays two different kinds of maps: highly detailed maps of build-
ings and small outdoor areas, and OpenStreetMap data for an overview of large
outdoor environments.

4.1    Detailed Maps
The detailed maps are hand-built and were originally created for the ASBUS
project, which among other goals aims at making the University of Stuttgart
accessible to disabled and especially blind students by providing a navigation
system. The maps show buildings with rooms and even small details like pillars
and benches. The maps are stored in XML files based on the CityGML standard,
but limited to two dimensions.
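
As a rough illustration of how such a two-dimensional map file might be read, the following sketch parses a hypothetical, heavily simplified CityGML-inspired structure. The element and attribute names are invented for illustration; the project's actual schema is not reproduced here.

```python
# Hedged sketch: reading a hand-built indoor map from a simplified,
# CityGML-inspired XML file. Element and attribute names are hypothetical.
import xml.etree.ElementTree as ET

def load_rooms(path: str):
    """Return (name, polygon) pairs, where a polygon is a list of (x, y) tuples."""
    rooms = []
    for room in ET.parse(path).getroot().iter("Room"):
        name = room.get("name", "")
        # Assumed footprint format: whitespace-separated "x,y" coordinate pairs.
        coords = [tuple(map(float, pt.split(",")))
                  for pt in room.findtext("footprint", "").split()]
        rooms.append((name, coords))
    return rooms
```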

4.2    OpenStreetMap
For large outdoor areas where no detailed maps are available, OpenStreetMap
data is used. A certain area around the current viewport is downloaded. If the
user scrolls out of that viewport, new data is downloaded automatically.
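
A minimal sketch of this download-ahead behavior is shown below; the function names and the fixed margin (in degrees) are assumptions, not the system's actual values.

```python
# Hedged sketch: fetch a bounding box larger than the viewport and re-download
# only when the user scrolls outside of the already downloaded area.
def needs_refresh(viewport, downloaded):
    """Both arguments are (min_lon, min_lat, max_lon, max_lat) tuples."""
    return not (downloaded[0] <= viewport[0] and downloaded[1] <= viewport[1]
                and viewport[2] <= downloaded[2] and viewport[3] <= downloaded[3])

def expanded_bbox(viewport, margin=0.01):
    """Grow the viewport by a fixed margin (in degrees) before downloading."""
    min_lon, min_lat, max_lon, max_lat = viewport
    return (min_lon - margin, min_lat - margin, max_lon + margin, max_lat + margin)
```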
    OpenStreetMap is a community mapping effort that has become a viable al-
ternative to commercial data providers, especially for pedestrians: In 2011
OpenStreetMap provided a more than 30% larger street network for pedestrian
navigation in Germany than the commercial TomTom Multinet 2011 database [3].
OpenStreetMap data consists of three types: Nodes, ways and relations. Nodes
are simple points on the map, ways are linestrings that connect the nodes and
relations can contain both nodes and ways. All three can have an arbitrary num-
ber of key-value string pairs called tags, which store the type of the object as
well as any additional information.
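
The following sketch shows how this data model can be read from an OpenStreetMap XML extract, such as one returned by the public 0.6 map API for a bounding box. It mirrors the description above rather than the system's own parser.

```python
# Hedged sketch: parse nodes, ways and their tags from an OpenStreetMap
# XML extract into plain dictionaries.
import xml.etree.ElementTree as ET

def parse_osm(xml_text: str):
    root = ET.fromstring(xml_text)
    nodes, ways = {}, {}
    for n in root.iter("node"):
        nodes[n.get("id")] = {
            "lat": float(n.get("lat")), "lon": float(n.get("lon")),
            "tags": {t.get("k"): t.get("v") for t in n.iter("tag")},
        }
    for w in root.iter("way"):
        ways[w.get("id")] = {
            "node_refs": [nd.get("ref") for nd in w.iter("nd")],  # ordered node ids
            "tags": {t.get("k"): t.get("v") for t in w.iter("tag")},
        }
    return nodes, ways
```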


5     Interactions
As in any common map application, the user may zoom and scroll the map by
using the arrow keys or the four-way digital cross of the tactile graphics display.
The main additional feature is that the user can click on any object by pressing
a button while keeping a finger on the object. Its name, function or address
is then read out by a text-to-speech engine. If a hand-built map is used, this is
straightforward, as all displayed objects are named. However, in a community-
based environment, in our case OpenStreetMap, the data is not always present in
such a straightforward manner. Therefore, in order to be helpful to the user, the
text that is read out has to be chosen more carefully. If the object is tagged with
a name tag, the name is read out directly. If no name is given, a combination
of the type of the object and (if available) the address is read out. The type
of the object is determined by the OpenStreetMap tags in combination with
strings from the styles (section 6.1), which also allow translation into other



languages. If several objects are stacked on top of each other in OpenStreetMap,
the objects are read out one by one after each consecutive click, beginning with
the innermost. For detailed maps, consecutive clicks are equivalent to going up
one step in the GML hierarchy, so that e.g. a building’s name will be announced
after a room number.
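    The announcement logic described above can be summarized in a short sketch; the helper name and the style lookup table are assumptions for illustration, not the system's actual code.

```python
# Hedged sketch: prefer the "name" tag, otherwise combine an object type
# (looked up from the active style, assumed here to be a dict keyed by
# "key=value" strings) with the OpenStreetMap address tags, if present.
def speech_text(tags: dict, type_labels: dict) -> str:
    if "name" in tags:
        return tags["name"]
    # Derive a readable type from the first tag the style knows about.
    obj_type = next((type_labels[k + "=" + v] for k, v in tags.items()
                     if k + "=" + v in type_labels), "unknown object")
    street = tags.get("addr:street", "")
    number = tags.get("addr:housenumber", "")
    address = (street + " " + number).strip()
    return obj_type + (", " + address if address else "")
```

For instance, an unnamed object tagged building=yes together with address tags would be announced as "building" followed by its street and house number.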
    Some features of the system can be accessed by a menu. After opening the
menu with a keystroke, the user can navigate through the menu with the cursor
keys or the digital cross. The individual menu items are read out by the text-
to-speech engine. The menu allows access to the various maps (section 4), the
different styles (section 6.1), and the place search (section 6.2).


6     Additional Features

While the system as described above is functional, some additional features were
implemented in close collaboration with a blind colleague, who currently uses
the system most frequently. Those features can greatly enhance the usability of
the system.


6.1   Display Styles

The display component of our system is completely configurable. This means
that for every OpenStreetMap tag, the color and thickness of the lines that are
drawn, as well as the color and hatching of polygonal objects, can be freely
chosen. Objects can also be completely hidden, depending on their tags. Details
like color and hatching cannot be reproduced on the tactile graphics display,
and are mainly useful for collaboration with sighted users who use a monitor.
Line thickness or the hiding of objects can be used to avoid clogging the tactile
graphics display with too much information. These style settings can be stored
and loaded and also added to a quick styles menu, allowing easy selection of
different styles, such as “only buildings” or “only streets”. Figure 2 shows a
detail of an area near our University campus, showing both buildings and streets
(a, b), only buildings (c, d), and only streets (e, f).
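
A minimal sketch of what such a style definition could look like is given below; the field names and the wildcard tag notation ("highway=*") are assumptions, not the system's actual configuration format.

```python
# Hedged sketch: per-tag rendering attributes with a "hidden" flag, plus two
# quick styles of the kind mentioned above ("only streets", "only buildings").
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class Style:
    color: str = "black"
    line_width: int = 1
    hatching: Optional[str] = None   # e.g. "diagonal"; ignored on the pin display
    hidden: bool = False             # drop matching objects entirely

ONLY_STREETS: Dict[str, Style] = {
    "highway=*":  Style(line_width=3),   # thick lines so roads stay readable
    "building=*": Style(hidden=True),    # hide buildings to reduce clutter
}

ONLY_BUILDINGS: Dict[str, Style] = {
    "building=*": Style(hatching="diagonal"),
    "highway=*":  Style(hidden=True),
}
```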


6.2   Place Search

The place search allows entering the name of cities, streets or well-known enti-
ties such as landmarks. The search string is forwarded to both GeoNames and
OpenStreetMap Nominatim. The places found by both services are presented to
the user in an accessible list, which can again be navigated with the cursor keys.
Upon selection of an entity, the map is switched to OpenStreetMap (if it is not
already displayed) and centered on the geographic position of the selected entity.
This allows fast switching between different areas of interest.
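
A sketch of the combined search is shown below, using the public Nominatim and GeoNames endpoints; the GeoNames account name is a placeholder, and error handling and result ranking are omitted.

```python
# Hedged sketch: send the same query to Nominatim and GeoNames and merge the
# results into one list of (label, lat, lon) entries.
import requests

def search_places(query: str, geonames_user: str = "demo"):
    results = []
    nominatim = requests.get("https://nominatim.openstreetmap.org/search",
                             params={"q": query, "format": "json"},
                             headers={"User-Agent": "tactile-map-sketch"}).json()
    results += [(p["display_name"], float(p["lat"]), float(p["lon"]))
                for p in nominatim]
    geonames = requests.get("http://api.geonames.org/searchJSON",
                            params={"q": query, "username": geonames_user}).json()
    results += [(p["name"], float(p["lat"]), float(p["lng"]))
                for p in geonames.get("geonames", [])]
    return results
```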





Fig. 2. A map shown with three different styles on both the computer screen and the
tactile graphics display: (a, b) buildings and streets, (c, d) buildings only, (e, f) streets
only.


7    Results
First tests have shown that the system can greatly enhance the spatial under-
standing of its users. A blind colleague who uses the system says that she has
had to correct her understanding of the street layout even for areas that she has



lived in for a long time. The possibility to quickly jump to a desired location was
regarded positively, as it enabled the user to go from exploring one area, such as
the workplace, to another, e.g. the place of residence. Furthermore, the different
styles were regarded as very helpful for reducing information overload in specific
tasks; e.g. our colleague chose to hide all buildings when exploring the layout of
streets. The use of OpenStreetMap has the advantage of providing a worldwide
data set. Especially in conjunction with the flexibility achieved by the different
styles, this is a key difference from much of the previous work in this area. The
effects of the low resolution can be mitigated by using appropriate styles, e.g. by
rendering only streets above a certain importance when zoomed out. However,
the effects of using different styles for different zoom levels still need to be
evaluated.
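
A zoom-dependent filter of this kind could look roughly as follows; the zoom threshold and the set of road classes are assumptions for illustration.

```python
# Hedged sketch: when zoomed out, keep only major road classes so the pin
# display is not overloaded with minor streets.
MAJOR_ROADS = {"motorway", "trunk", "primary", "secondary"}

def is_visible(tags: dict, zoom: int) -> bool:
    highway = tags.get("highway")
    if highway is None:
        return True                      # non-road objects handled by other rules
    if zoom >= 15:
        return True                      # zoomed in: show every road
    return highway in MAJOR_ROADS        # zoomed out: major roads only
```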

Acknowledgments. This work was funded by the Deutsche Forschungsgemein-
schaft (DFG). We would like to thank Andrea Berghammer for her involvement
in the design process and her invaluable feedback.

References
1. Heuten, W., Henze, N., Boll, S.: Interactive exploration of city maps with auditory
   torches. In: CHI ’07: CHI ’07 extended abstracts on human factors in computing
   systems. pp. 1959–1964. ACM, New York, NY, USA (2007)
2. Heuten, W., Wichmann, D., Boll, S.: Interactive 3D sonification for the exploration
   of city maps. In: NordiCHI ’06: Proceedings of the 4th Nordic conference on human-
   computer interaction. pp. 155–164. ACM, New York, NY, USA (2006)
3. Neis, P., Zielstra, D., Zipf, A.: The street network evolution of crowdsourced maps:
   OpenStreetMap in Germany 2007–2011. Future Internet 4(1), 1–21 (2012)
4. Paladugu, D.A., Wang, Z., Li, B.: On presenting audio-tactile maps to visually
   impaired users for getting directions. In: CHI EA '10: Extended abstracts on human
   factors in computing systems. pp. 3955–3960. ACM, New York, NY, USA (2010)
5. Rice, M., Jacobson, R.D., Golledge, R.G., Jones, D.: Design considerations for hap-
   tic and auditory map interfaces. Cartography and Geographic Information Science
   (CaGIS) 32(4), 381–391 (2005)
6. Rotard, M., Taras, C., Ertl, T.: Tactile web browsing for blind people. Multimedia
   Tools and Applications 37(1), 53–69 (2008)
7. Schmitz, B., Ertl, T.: Making digital maps accessible using vibrations. In: Proceed-
   ings of the 12th international conference on computers helping people with special
   needs (ICCHP 2010). Lecture Notes in Computer Science, vol. 6179, pp. 100–107.
   Springer Berlin / Heidelberg (2010)
8. Wang, Z., Li, B., Hedgpeth, T., Haven, T.: Instant tactile-audio map: enabling
   access to digital maps for people with visual impairment. In: Proceedings of the
   11th international ACM SIGACCESS conference on computers and accessibility.
   pp. 43–50. Assets ’09, ACM, New York, NY, USA (2009)
9. Zeng, L., Weber, G.: Audio-haptic browser for a geographical information system.
   In: Proceedings of the 12th international conference on computers helping people
   with special needs (ICCHP 2010). pp. 466–473. Lecture Notes in Computer Science,
   Springer-Verlag, Berlin, Heidelberg (2010)


