<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Inclusive Navigation Systems: Perspectives and Challenges for the Visually-Impaired</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Dimitri Belli</string-name>
          <email>dimitri.belli@isti.cnr.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Paolo Barsocchi</string-name>
          <email>paolo.barsocchi@isti.cnr.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Antonino Crivello</string-name>
          <email>antonino.crivello@isti.cnr.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Francesco Furfari</string-name>
          <email>francesco.furfari@isti.cnr.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Barbara Leporini</string-name>
          <email>barbara.leporini@di.unipi.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Maria Teresa Paratore</string-name>
          <email>mariateresa.paratore@isti.cnr.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Computer Science, University of Pisa</institution>
          ,
          <addr-line>Largo B. Pontecorvo 3, Pisa</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Institute of Information Science and Technologies (ISTI), National Research Council (CNR)</institution>
          ,
          <addr-line>Via G. Moruzzi 1, Pisa</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Despite significant advances in technology, the area of mobility and orientation for visually impaired persons continues to present significant challenges. Digital maps have become essential for navigation, but their usability is often compromised for users who rely on assistive technologies, especially when accessed on small touch screens. This calls for innovative approaches to making digital maps more accessible and usable, as these tools are crucial for creating mental maps of navigational spaces. This paper explores the need for inclusive localization and positioning systems that accommodate a wide range of users, including those with visual impairments. It highlights the critical role of user context, such as device experience and positional awareness, in improving the usability of these systems. The integration of haptic and audio feedback may offer promising new interaction methods, although further development is needed. In addition, user interface design and system characteristics such as security, robustness and usability need to be aligned with user acceptance, with a focus on low cost and simplicity. Our analysis identifies key requirements for the design of inclusive systems and proposes steps for the scientific community to take to advance the field, with the aim of bridging the gap between technological capabilities and practical usability, and promoting inclusive design principles for future innovation.</p>
      </abstract>
      <kwd-group>
        <kwd>aids</kwd>
        <kwd>Inclusivity</kwd>
        <kwd>Indoor localization</kwd>
        <kwd>Orientation and mobility</kwd>
        <kwd>Accessibility</kwd>
        <kwd>Blind and visually impaired</kwd>
        <kwd>Navigation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
The terms “orientation” and “mobility” are related to the concepts of wayfinding and locomotion,
respectively, and identify the fundamental components of spatial decision making [
        <xref ref-type="bibr" rid="ref1">1</xref>
]. Good “orientation”
and “mobility” skills are essential for a visually impaired person who moves independently.
Orientation refers to the skill of planning a route from the current position to a given destination; it thus
requires knowledge of the whole area and its relevant landmarks, organized in a mental representation of the
environment (a.k.a. “cognitive map”) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Mobility, on the other hand, involves all the actions that
an individual takes in direct response to the physical characteristics of the environment as he or she
traverses it (e.g. avoiding an obstacle, descending or ascending stairs), and thus depends only on the
perception of the immediate environment at a given moment. Knowing in advance the structure of
a new indoor or outdoor environment and where its points of interest are located can help visually
impaired visitors find their way around and quickly reach their destination. Tools such as tactile maps,
3D physical models and accurate text descriptions help users to build a cognitive map of the space to be
explored. A preliminary physical exploration, either with the help of a person or relying solely on the
traditional white cane, is also a valuable aid.
      </p>
      <p>Proceedings of the Work-in-Progress Papers at the 14th International Conference on Indoor Positioning and Indoor Navigation
(IPIN-WiP 2024)
(M. T. Paratore)</p>
      <p>
        Nowadays, most public spaces provide visitors with navigation tools, such as digital signs, websites,
or mobile applications. Unfortunately, these tools often pose accessibility problems for visitors with
special needs. For example, digital maps integrated into mobile applications typically lack features to
support accessibility for persons with visual impairments [
        <xref ref-type="bibr" rid="ref3">3</xref>
], yet accessible navigation mobile apps
have proven to be effective assistive solutions for this category of users [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. In [
        <xref ref-type="bibr" rid="ref5">5</xref>
] the authors discuss the daily
challenges faced by visually impaired persons and point out how ICT, and more specifically assistive
technologies, can help them achieve greater social inclusion and autonomy.
      </p>
      <p>
        Designing an application for visually impaired persons requires special hardware infrastructure
and positioning techniques, adapted to the characteristics of the environment (e.g. wall thickness and
geometry) and the specific needs of the users [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Commonly adopted infrastructures include Wi-Fi,
Radio-frequency identification (RFID), Bluetooth Low Energy (BLE, e.g., Apple’s iBeacon: https://developer.apple.com/ibeacon/), Long Range (LoRa) communication,
and Ultra Wide Band (UWB), sometimes combined in hybrid systems [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
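<p>As a rough illustration of how one of these infrastructures can support room-level positioning, the following sketch estimates the current room from BLE beacon scan results by choosing the room whose beacons were heard strongest. The beacon identifiers, room names, and the strongest-mean-RSSI heuristic are our own illustrative assumptions, not a method taken from the cited works.</p>

```python
# Hypothetical sketch: room-level localization from BLE beacon RSSI readings.
# Beacon IDs and room names are illustrative assumptions.

from collections import defaultdict

# Map each (illustrative) beacon ID to the room where it is installed.
BEACON_ROOM = {
    "beacon-a": "entrance hall",
    "beacon-b": "corridor",
    "beacon-c": "elevator lobby",
}

def estimate_room(rssi_readings):
    """Pick the room whose beacons were heard strongest on average.

    rssi_readings: list of (beacon_id, rssi_dbm) tuples collected
    over a short scan window.
    """
    per_room = defaultdict(list)
    for beacon_id, rssi in rssi_readings:
        room = BEACON_ROOM.get(beacon_id)
        if room is not None:
            per_room[room].append(rssi)
    if not per_room:
        return None
    # Stronger (less negative) mean RSSI wins.
    return max(per_room, key=lambda r: sum(per_room[r]) / len(per_room[r]))

scan = [("beacon-a", -82), ("beacon-b", -60), ("beacon-b", -58), ("beacon-c", -75)]
print(estimate_room(scan))  # corridor
```

<p>A real deployment would smooth readings over time and calibrate per-environment, since RSSI varies with wall geometry, as noted above.</p>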
      <p>
Recently, the growing popularity of AI algorithms has led to their integration into several assistive
solutions in the field, such as Seeing AI (https://www.seeingai.com/) and Be My AI (https://www.bemyeyes.com/blog/introducing-be-my-ai). In particular, deep-learning algorithms for image recognition have been used
to empower the traditional white cane [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] to facilitate object recognition and obstacle avoidance, or
integrated into mobile apps to simplify wayfinding and orientation tasks, such as VoiceVista (https://www.applevis.com/apps/ios/navigation/voicevista). Impairments can pose
challenges to the accessibility of human-machine interfaces, potentially impacting the overall user
experience [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. In a mobile application, user interfaces need to be carefully designed to prevent
accessibility issues; for instance, avoiding pop-ups is critical when creating GUIs for visually impaired
users. Personalization of information (i.e., choosing which information to present and how to deliver
it) is another aspect that must be considered. Visually impaired persons commonly use non-visual
channels to perceive their surroundings [9]. While exploring an environment, blind subjects usually
rely on the sounds they hear and the obstacles they discover using the white cane, in order to identify
and remember landmarks, and to better orient themselves [
        <xref ref-type="bibr" rid="ref2">2, 10</xref>
        ]. Other signals such as smells can
also be associated with landmarks, and may become valuable aids for orientation. Moreover, blind
persons are not inclined to rely exclusively on assistive technology, as they are unwilling to give up the
traditional white cane, which lets them move around more safely. On the other hand, partially sighted individuals
may have a different approach, as they can make use of their residual vision. Besides these general
considerations, each individual has their own unique preferences influenced by factors like age, gender
and personal experiences [11]. For example, not all visually impaired subjects are equally responsive
when it comes to the sense of touch, depending on their age or familiarity with tactile devices. For
an orientation and mobility system to be effective, it is not only necessary to have a robust software
and hardware infrastructure, but also a properly designed user experience for the related software
applications and devices. To this end, the adoption of co-design strategies in a preliminary phase may
prove fundamental [12, 13].
      </p>
      <p>As the performances of smartphones have increased, they have gained popularity as tools to enhance
the independence of individuals with visual impairments. Mobile applications also benefit from the
large distribution capacity provided by app stores on the Internet. Thanks to features such as vibration
motors, text-to-speech (TTS) and accessibility services, smartphones offer cost-effective solutions for
providing information through tactile and auditory channels [14]. However, in many cases, audio
alerts can be intrusive or difficult to hear in noisy environments. In addition, the vibrations provided
by a smartphone may not be sufficient to guide a user through a complex physical environment, and
visually impaired users may find it cumbersome to hold a smartphone in one hand and a guide dog or
white cane in the other. There are many assistive devices designed to overcome these problems, with
software and hardware tailored to the computational demands of traversal and localization problems.
Some of these devices are designed as stand-alone modules, while others can be integrated into mobile
apps. Their rather high cost is, however, a limiting factor to their diffusion. Examples of such devices
include smart glasses and other wearables based on the haptic channel. Assistive smart glasses exist,
which may also come equipped with bone conduction headphones to reduce the burden of auditory
stimuli, or integrated with software libraries for augmented reality or artificial intelligence (e.g., IrisVision: https://irisvision.com/electronic-glasses-for-the-blind-and-visually-impaired/). Assistive
wearable belts are another type of hands-free device, consisting of a series of actuators (e.g., FeelSpace: https://feelspace.de/en/), which are
capable of transmitting fine-grained directional details. This work highlights several key issues related
to the development of assistive navigation applications for visually impaired persons, issues that should
receive more attention from the scientific community. The articles we have identified and discussed
have been carefully selected to underscore specific aspects such as user requirements, user experience,
usability, and accessibility of assistive navigation devices. While this work does not aim to be an
exhaustive or systematic review of the literature on the subject, it serves as a compass: a preliminary
effort designed to guide and inspire future, in-depth studies that will expand our understanding and
improve the development of these technologies.</p>
<p>The remainder of the paper is organized as follows: Section 2 outlines the key requirements for
defining a usability-centered and inclusive assistive navigation solution. Section 3 connects these
requirements to the context of indoor localization. Finally, Sections 4 and 5 respectively summarize
the main open challenges in this area and draw conclusions.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Understanding user needs: Requirements for an inclusive experience</title>
<p>Designing an assistive device for users with special needs implies focusing on their expectations, in
order to develop hardware and software solutions which account for an effective and satisfactory user
experience. In [15], the authors conducted a survey involving twenty-five blind users, and highlighted
several important observations: blind persons prefer interactive tactile maps to navigate towards
distant destinations or specific nearby objects, such as elevators, stairs, entrances, room numbers, and
exits. Additionally, the survey found that blind users favor sound-guided navigation outdoors and
tactile navigation indoors. Another notable outcome of this survey is that blind individuals often face
challenges in locating the correct room and finding elevators or stairs indoors. The latter is qualitatively
important for applications designed to help visually impaired individuals move independently [16]. Key
specifications for these applications include:
1. Minimizing the information load provided to the user through feedback, using auditory and tactile
channels with the least intrusiveness possible, ideally considering hands-free and ears-free
solutions.
2. Ensuring that the device is lightweight and easy to wear and transport, without restricting the user’s
freedom of movement.
3. Balancing the complexity of the system with a long battery life to ensure effective operation over
a long period.
4. Designing intuitive interfaces with a short learning curve, to minimize the effort necessary
to understand and use the device.
5. Ensuring safety, security, and reliability, so as to establish a sense of trustworthiness in the user,
rather than anxiety towards technology. For this reason, privacy and security aspects related to
sensitive information must be considered thoroughly.</p>
<p>The literature commonly distinguishes between outdoor and indoor navigation systems, each
addressing specific and distinct user requirements. Solutions for seamless navigation between these
environments do exist, but they are still in the early stages, with many challenges yet to be addressed and
little work dedicated specifically to the topic [17, 18].</p>
<p>As reported in [19], outdoor urban navigation systems typically involve three main tasks:
environmental mapping, route planning, and real-time navigation. User requirements for these tasks vary.</p>
<p>[Figure 1: User requirements for indoor and outdoor navigation solutions for blind and visually impaired persons: compass (orientation), route planning, haptic and audio feedback, building mapping, door and window localization, staircase and lift/elevator localization, pedestrian routing, public transportation routing, pedestrian traffic light detection, sidewalk detection, crosswalk detection, zebra crossing detection, localization, obstacle avoidance, multiple input channels (haptic and audio), and multi-device support (portability).]</p>
<p>For environmental mapping, they include identifying road junctions, traffic lights, pavements, zebra
crossings, and transport options. For route planning, they include selecting locations and routes, and
choosing the type of path (pedestrian or public transport). For real-time navigation, requirements range
from understanding the environment through contextual information (e.g., distance to destination,
travel time and so forth) to identifying obstacles. User requirements for indoor navigation systems for
blind and visually impaired persons, instead, can be categorized into functional and usability
requirements, and route description requirements [20]. Concerning functional and usability requirements,
blind and visually impaired users prioritize a navigation support system with functionalities like route
planning across devices (mobile and desktop), saving destinations for later use, and receiving current
location information with detailed descriptions of points of interest and environment facilities like
stairs, elevators, toilets, and coffee and snack vending machines. They also require clear feedback on inputs,
advanced obstacle warnings (tactile and audible), route retracement options, wrong-turn alerts, and
the ability to store and share points of interest. As for route descriptions, personalization, concise
summaries, functional waypoint markers, and integration of environmental cues (tactile, auditory,
olfactory) are crucial. While mobile devices are preferred for navigation, some users value desktop
planning for its ease of information input. Fig. 1 summarizes the main user requirements presented above
for indoor and outdoor navigation solutions developed for assisting blind and visually impaired persons.
Although it touches on both outdoor and indoor environments, this work concentrates specifically on
user requirements for indoor localization.</p>
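<p>The route planning and route description requirements above can be sketched as a shortest-path search over a graph of landmarks, followed by a concise waypoint summary. The graph, landmark names, and choice of breadth-first search are illustrative assumptions, not taken from the cited surveys.</p>

```python
# Illustrative sketch: indoor route planning over a landmark graph,
# producing a concise, waypoint-based route description.

from collections import deque

# Hypothetical adjacency list of landmarks in a small building.
GRAPH = {
    "entrance": ["information desk", "stairs"],
    "information desk": ["entrance", "elevator"],
    "stairs": ["entrance", "room 12"],
    "elevator": ["information desk", "room 12"],
    "room 12": ["stairs", "elevator"],
}

def plan_route(start, goal):
    """Breadth-first search: shortest route by number of waypoints."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in GRAPH.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no route found

def describe(path):
    """Concise summary using functional waypoint markers."""
    return " -> ".join(path)

route = plan_route("entrance", "room 12")
print(describe(route))  # entrance -> stairs -> room 12
```

<p>Per the requirements above, such a summary would then be personalized and enriched with tactile, auditory, or olfactory cues at each waypoint.</p>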
    </sec>
    <sec id="sec-3">
<title>3. Translating user requirements into the indoor localization context</title>
      <p>To deploy an inclusive localization system by design, we need to address all the requirements outlined
in the previous section. Our proposal for an inclusive indoor localization system focuses on maximizing
the accuracy of the user’s location at the room level and precisely identifying transition areas such
as elevators and stairs, using interactive and tactile maps rather than relying solely on sound-guided
navigation. Our vision emphasizes the importance of identifying landmarks as easily recognizable
objects that serve as external reference points. Elevators and stairs are primary examples of these
landmarks, but other objects defined within a tactile map can provide valuable information throughout
the wayfinding and navigation process. This approach is consistent with recent findings in [21], which
identified a set of landmarks for outdoor scenarios. Consequently, our maps are designed to help users
locate landmarks and rooms through tactile means. By integrating these features, our proposal for a
localization system aims to provide a highly accurate and user-friendly solution. This ensures that
visually impaired and blind persons can navigate indoor environments with greater independence
and confidence. Importantly, this approach addresses the privacy awareness and trustworthiness
requirement by providing a system that is not only reliable and secure, but also fosters trust in the
technology. The use of tactile maps enhances the user experience by making navigation intuitive
and reducing the anxiety related to technology use. Moreover, the design of the system prioritizes
privacy and the secure handling of sensitive information, further contributing to a safe and trustworthy
navigation aid.</p>
      <p>We propose that the inclusive maps should be primarily tactile, produced for key areas to allow
blind users to feel their way around the map. These maps should be enriched to work seamlessly with
screen readers, providing audio descriptions of landmarks where appropriate, while prioritizing the
tactile cues. Users can receive spoken instructions and descriptions of their surroundings to help them
navigate. The color schemes used for the digital maps should emphasize high contrast to ensure that
users with low vision can distinguish between different elements. Important text and symbols should be
displayed in larger fonts to improve readability. Map layout should be kept deliberately simple, focusing
on essential routes and landmarks. Non-essential details should be minimized to avoid confusion and
ensure that users can quickly understand and use the map. The system should also include interactive
features, such as customizable haptic cues issued from the touch screen, allowing users to interact with
the map in a way that best suits their needs and preferences. All of these proposed features must be
implemented in accordance with the unobtrusive feedback requirement, which emphasizes minimizing
the information load provided to the user through feedback. By using tactile and auditory channels
with the least possible intrusiveness, the system favors hands-free and ears-free solutions wherever
possible. This approach minimizes cognitive overload and allows users to navigate intuitively and
efficiently without being overwhelmed by excessive information.</p>
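<p>The customizable haptic cues proposed above could be modeled as follows: each map element type maps to a vibration pattern, with user overrides taking precedence over defaults. The element names and pattern values are assumptions for illustration only, not part of the proposal itself.</p>

```python
# Hedged sketch: customizable haptic cues for interactive map elements.
# Patterns are lists of (vibrate_ms, pause_ms) pairs; all values are
# illustrative assumptions.

DEFAULT_PATTERNS = {
    "elevator": [(100, 50), (100, 50)],   # two short pulses
    "stairs":   [(300, 100)],             # one long pulse
    "exit":     [(100, 50), (300, 0)],    # short then long
}

class HapticMap:
    def __init__(self, patterns=None):
        # Users can override defaults to suit their own preferences.
        self.patterns = dict(DEFAULT_PATTERNS)
        if patterns:
            self.patterns.update(patterns)

    def cue_for(self, element):
        # Fallback: a single short blip for unrecognized elements.
        return self.patterns.get(element, [(50, 0)])

user_map = HapticMap(patterns={"stairs": [(150, 50), (150, 0)]})
print(user_map.cue_for("stairs"))    # the user-customized pattern
print(user_map.cue_for("elevator"))  # the default pattern
```

<p>On a real device the pattern list would be handed to the platform vibration API; keeping the mapping user-editable addresses the preference variability discussed earlier.</p>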
<p>In Table 1, we report significant works in outdoor and indoor environments, highlighting whether each user
need is addressed (✓) or not considered (empty box). Below we provide more detail on the works
analyzed, describing their strengths and possible lessons learned for achieving inclusive indoor systems.</p>
      <p>In [22], the development of an integrated system is described, in which a white cane is equipped
with two actuators in its handle to provide directional hints. The authors report findings from a
usability and user experience study conducted with blind and visually impaired persons, showing
positive participants’ perceptions and highlighting a preference for haptic guidance with two actuators.
Conversely, partially sighted participants preferred the single actuator method. These conclusions
underline the importance of vibration feedback for blind and visually impaired individuals as a useful
tool for improving orientation and providing feedback and input on user mobility, in line with the
respective orientation and mobility requirement. It is therefore essential to develop a system that is not
only lightweight and easy to wear and carry, thus satisfying the ergonomic portability requirement, but
also successfully balances system complexity with long battery life. By using lightweight materials and
an ergonomic design, the device will ensure the user’s freedom of movement. In addition, the use of low
power technologies and software optimizations allows the system to operate effectively for extended
periods without the need for frequent recharging.</p>
<p>Similar findings are provided in [23]. The authors propose a real-time system for blind persons which
exploits a distance camera to provide feedback through speakers and vibration from a smartwatch. Here
it is important to emphasise the need for near real-time capabilities in real-world applications. This is
particularly critical in indoor environments, given the number of meters that can be walked in a short
period of time and the potential hazards posed by walls, doors, stairs and furniture. Unfortunately,
their system relies on a camera placed above the user’s head, and while in theory such solutions can
work in a real environment, in this particular use case we should consider, and strongly reaffirm,
the importance of providing less invasive solutions for real users. This is a key requirement for user
acceptance and widespread adoption in the real world. Meeting the ergonomic portability requirement,
the inclusive indoor localization system must be lightweight and easy to wear and transport, ensuring
that it does not restrict the user’s freedom of movement. The use of a head-mounted camera can be
perceived as cumbersome and intrusive, limiting the user’s comfort and willingness to use the device
consistently.</p>
      <p>In [24], a mobile-based wayfinding tool is presented to address the challenges faced by visually
impaired persons in navigating independently in urban environments. The system consists of a software
application for Android devices that communicates with several external components. These include a
high-precision global positioning module that continuously monitors the user’s movement, a special
device that attaches to traffic lights to determine their current state, and an ultrasonic detection unit to
identify nearby obstacles in the pedestrian’s path. This comprehensive solution aims to increase the
autonomy and safety of visually impaired persons as they navigate public spaces without relying on
human assistance.</p>
      <p>It is worth emphasizing that more mature solutions have been developed in terms of inclusivity for
outdoor environments, and this knowledge of outdoor navigation should be applied to the design of
indoor systems. These indoor environments present unique challenges where, for example, precision
and accuracy may be paramount in some areas and less critical in others. Similarly, the concentration
of potential hazards may be more pronounced in certain areas - such as near staircases - than in others.
This variability in environmental characteristics requires a nuanced approach to the design of indoor
navigation systems, one that incorporates lessons learned from outdoor solutions while addressing the
specific challenges of indoor environments.</p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref2">2</xref>
], instead, the authors propose a low-cost solution for the construction of cognitive maps for visually
impaired users in urban environments, exploiting the TTS and vibration capabilities offered by the
Android Operating System. Although the proposed prototype is based on GPS coordinates in outdoor
environments, the adoption of the GeoJSON (Geographic JavaScript Object Notation) format for the
maps entails no loss of generality, as the same format can also be used to describe indoor contexts.
The customization of GeoJSON metadata to provide accessible feedback tailored to the users’ needs is
described, as well as other technical solutions adopted to ensure accessibility. The importance of
using a co-design strategy is emphasised by describing how target users were actively involved during
all stages of the design and development, by means of interviews and prototype testing. Gathering
feedback from these users proved to be extremely important both for defining application functionalities
and for learning the most suitable interaction modes according to specific individual needs.
      </p>
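<p>The GeoJSON metadata customization described above might look like the following sketch: a point feature whose "properties" carry accessibility metadata that a TTS engine can read aloud. The property names ("tts_description", "haptic_pattern") and values are our own illustrative assumptions, not the schema used in the cited work.</p>

```python
# Sketch of accessibility metadata embedded in a GeoJSON feature.
# Property names are hypothetical, for illustration only.

import json

feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [10.4036, 43.7191]},
    "properties": {
        "name": "Main entrance",
        "landmark_type": "entrance",
        # Accessible feedback tailored to the user's needs:
        "tts_description": "Main entrance, automatic sliding doors.",
        "haptic_pattern": "short-short",
    },
}

def spoken_hint(f):
    """Text a TTS engine could read when the user nears the feature."""
    props = f["properties"]
    return props.get("tts_description", props.get("name", "unnamed landmark"))

print(spoken_hint(feature))  # Main entrance, automatic sliding doors.
# GeoJSON stays valid JSON, so it round-trips through standard tooling:
print(json.loads(json.dumps(feature))["properties"]["landmark_type"])  # entrance
```

<p>Because GeoJSON places no constraints on "properties", the same file format describes indoor landmarks as readily as outdoor ones, which is the generality argument made above.</p>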
<p>In [25, 26, 27] the authors propose portable and wearable solutions for detecting obstacles and providing
navigation directions. Most of the systems mainly exploit the haptic channel so as not to overload
the auditory channel. Nonetheless, a thorough study is required to determine the level of information
encoding that the system can (a) provide to the user, and (b) perceive from the user, in an
intuitive and effective manner. Furthermore, while the auditory channel must be left unobstructed, its
purposeful and accurate use deserves further investigation to enable systems to offer more information
based on the user’s context and preferences. A good combination of user location strategies and knowledge
of user preferences and usage patterns, together with system customization features and functionality, can open
up new user experience scenarios in navigation tasks. The system should be able to detect
the context and at the same time allow the user to receive the information of his or her interest through
the preferred feedback channel, with a level of granularity driven not only by the user’s requests but also by
the user experience. In some cases a haptic signal is more immediate and efficient, while in others a
verbalization of information allows for more precise and targeted content. Systems proposed in the
literature attempt to bridge some of these issues, but we are still far from effective solutions. These
systems are often unacceptable to users because they are either expensive, flashy, or inconvenient to
use, thus failing to meet requirements such as ergonomic portability, orientation and mobility, and
effortless interaction. Although some low-cost solutions have been attempted, the technology does not
yet offer the ability to achieve the desired results at a relatively low cost, nor does it
yet offer the precision and accuracy that would be greatly needed in the case of visually impaired
persons. As devices, sensors and applications evolve, more affordable solutions for the user should be
further investigated.</p>
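<p>The context- and preference-driven choice between haptic and verbal feedback discussed above can be sketched as a simple selection rule. The event kinds, preference fields, and thresholds are illustrative assumptions; a real system would learn these from user testing.</p>

```python
# Minimal sketch of context-aware feedback selection: choose between a
# haptic cue and a verbal (TTS) message per event. All names are
# illustrative assumptions.

def select_feedback(event, prefs, ambient_noise_db):
    """Return ("haptic", pattern) or ("speech", text) for an event.

    prefs: dict with optional "prefers_haptic" (bool) and
    "max_speech_noise_db" (int) entries.
    """
    # Immediate hazards favor the faster, more immediate haptic channel.
    if event["kind"] == "obstacle":
        return ("haptic", "long-pulse")
    # In loud environments speech may be inaudible; fall back to haptics.
    if ambient_noise_db > prefs.get("max_speech_noise_db", 70):
        return ("haptic", "short-short")
    # Respect the user's stated channel preference for routine cues.
    if prefs.get("prefers_haptic") and event["kind"] != "poi_description":
        return ("haptic", "short")
    # Precise, targeted content is easier to verbalize.
    return ("speech", event.get("text", ""))

prefs = {"prefers_haptic": False, "max_speech_noise_db": 70}
event = {"kind": "poi_description", "text": "Coffee machine ahead"}
print(select_feedback(event, prefs, 55))  # ('speech', 'Coffee machine ahead')
```

<p>This separates the *what* (the event) from the *how* (the channel), which is the kind of customization the paragraph above argues is still missing from effective solutions.</p>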
<p>In [28], the authors discussed the potential use and effectiveness of an interactive map application
(i.e., the OTASCE map) to provide route guidance for visually impaired users. They presented a study
with 50 visually impaired participants who successfully traced routes and reached destinations using
the map. People could reach their destination even when they momentarily lost contact with the
screen and re-established touch interaction. After the study, the authors interviewed 24 users and
received valuable feedback. Some users misinterpreted coordinates or orientation, while others found
next-direction indications inaccurate or delayed. Feedback on audio and speech levels varied, with some
users preferring vibration feedback for route deviations. These insights contribute to our understanding
of the complexities of designing inclusive mapping solutions and, more generally, inclusive navigation
solutions.</p>
<p>Finally, although energy efficiency was not listed among the highlighted user requirements, it is a
fundamental consideration for applications that make heavy use of sensors and computational resources.
None of the articles examined addressed this aspect.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Open challenges</title>
<p>Designing cost-effective, user-friendly, and functional assistive technology for navigation remains
challenging despite significant recent progress. Designers still need to enhance positioning systems
to offer greater accuracy and customizable interfaces, ensuring a positive experience and tailored
information for users with special needs. An inclusive system for indoor navigation and orientation
should at least address the following requirements:
• Precision and Accuracy: Precision and accuracy in an inclusive indoor localization system
are a significant challenge, especially when it comes to complex environments characterized by
varying signal strength and many obstacles. It is crucial that a visually impaired user be guided
correctly through the environment, taking into account unexpected obstacles, specific points of
interest, such as information desks, and features such as stairs and elevators, which do not have
the same relevance for common users. Technologies such as LiDAR, BLE and image classification
have been employed for this purpose.
• Integration: Individuals with visual impairments mainly rely on traditional aids such as white
canes or guide dogs. It is hence important to provide technologies that can work in combination
with such aids, augmenting the perception of the surrounding environment without being
intrusive. Too many audio or haptic stimuli may in fact hinder the overall user experience.
Finding the proper cues to be issued is a challenge which can be addressed with co-design
strategies and extensive user testing.
• Human-Machine Interaction: To ensure a satisfying user experience, user interfaces should
be carefully designed with the aid of the end users and extensively tested. The users’ needs
should first be gathered through interviews, surveys, or focus groups. Then, users should be
actively involved in identifying the most suitable strategies to convey information. Information
is typically provided to visually impaired persons via the audio or haptic channels. On mobile
devices, TTS and accessibility services (such as Android’s TalkBack) are extensively used to
provide verbal hints, but simple sounds can also be used to signal precise events, such as the
proximity of a point of interest. The haptic channel can be adopted in many different ways: as
a means to guide the visually impaired user in a certain direction (typically in wearable
devices) or to signal the occurrence of an event, in which case patterns of vibration can be
used (e.g., different patterns may indicate different points of interest). Since many kinds of
visual impairment exist and each user has their own peculiarities, it is important that both the
information to be provided and the ways in which hints are issued be customizable according to
the end user’s unique preferences.
• Awareness and Trustworthiness: Security and robustness are important features which
become even more relevant in systems designed for persons with special needs. Both software
and hardware need to be extensively tested, and AI techniques (e.g., image classification and
recognition) can be employed to avoid the disclosure of sensitive data.</p>
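      <p>As an illustration of the beacon-based proximity cues mentioned under Precision and Accuracy, the sketch below estimates distances from BLE RSSI readings with a log-distance path-loss model and reports the nearest point of interest. It is a minimal, hypothetical example: the beacon identifiers, the reference power at 1 m, and the path-loss exponent are assumptions for illustration, not values from any of the surveyed systems.</p>

```python
# Hypothetical log-distance path-loss model: estimated distance (metres)
# from a BLE RSSI reading. tx_power_dbm is the RSSI measured at 1 m and
# n is the path-loss exponent (about 2 in free space, higher indoors).
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def nearest_poi(readings, poi_of_beacon):
    """readings maps beacon id to RSSI (dBm); returns (poi, distance)."""
    beacon = min(readings, key=lambda b: rssi_to_distance(readings[b]))
    return poi_of_beacon[beacon], rssi_to_distance(readings[beacon])

# Assumed beacon layout: each beacon marks one point of interest.
poi_map = {"b1": "information desk", "b2": "elevator", "b3": "stairs"}
poi, dist = nearest_poi({"b1": -72.0, "b2": -61.0, "b3": -80.0}, poi_map)
# The strongest reading (b2, -61 dBm) yields the smallest estimated
# distance, so the system would announce the elevator.
```

      <p>In practice RSSI readings are noisy; systems of this kind typically smooth successive readings or fuse them with other sensors before issuing a cue to the user.</p>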
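      <p>The event-signalling strategy described under Human-Machine Interaction can be sketched as a small, user-customizable vocabulary of vibration patterns. The event names and timings below are purely illustrative assumptions, written in the style of the millisecond off/on waveforms used by Android's vibration API.</p>

```python
# Illustrative vibration vocabulary: each event maps to alternating
# off/on durations in milliseconds (off, on, off, on, ...), the waveform
# convention used by Android's VibrationEffect.createWaveform.
DEFAULT_PATTERNS = {
    "route_deviation": (0, 400, 100, 400),     # two long pulses
    "poi_nearby": (0, 100, 80, 100, 80, 100),  # three short ticks
    "obstacle_ahead": (0, 700),                # one sustained pulse
}

def pattern_for(event, user_overrides=None):
    # Per-user customization: the user's own mapping takes precedence,
    # and unmapped events fall back to a single short tick so some
    # feedback is always produced.
    patterns = {**DEFAULT_PATTERNS, **(user_overrides or {})}
    return patterns.get(event, (0, 100))
```

      <p>Keeping the mapping as plain data is one way to satisfy the customizability requirement above: each user can override patterns without any change to the guidance logic.</p>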
    </sec>
    <sec id="sec-5">
      <title>5. Conclusion</title>
      <p>This paper highlights the growing interest in developing inclusive localization and positioning systems
that can be used by diverse types of users. The main challenge is to determine how this inclusivity
can be achieved. While recent developments in localization systems have focused on aspects such
as precision, accuracy, and privacy, now that technological solutions appear to be robust enough for
market entry and mass distribution, we believe it is essential not to neglect the requirements and needs
of all types of users. Our work aims to outline the key requirements for designing inclusive systems and
identify the next steps that the scientific community should take to achieve this goal. As technological
solutions become more sophisticated, it is crucial to ensure that their benefits are accessible to users
with different needs and abilities.</p>
      <p>[9] W. R. Wiener, R. L. Welsh, B. B. Blasch, Foundations of orientation and mobility, volume 1, American Foundation for the Blind, 2010.
[10] C. Prandi, G. Delnevo, P. Salomoni, S. Mirri, On supporting university communities in indoor
wayfinding: An inclusive design approach, Sensors 21 (2021) 3134.
[11] L. Ottink, H. Buimer, B. van Raalte, C. F. Doeller, T. M. van der Geest, R. J. van Wezel, Cognitive map
formation supported by auditory, haptic, and multimodal information in persons with blindness,
Neuroscience &amp; Biobehavioral Reviews 140 (2022).
[12] D. Plikynas, A. Žvironas, A. Budrionis, M. Gudauskis, Indoor navigation systems for visually
impaired persons: Mapping the features of existing technologies to user needs, Sensors 20 (2020).
[13] C. Wang, Y. Chen, S. Zheng, H. Liao, Gender and age differences in using indoor maps for
wayfinding in real environments, ISPRS International Journal of Geo-Information 8 (2018).
[14] D. Ahmetovic, C. Gleason, C. Ruan, K. Kitani, H. Takagi, C. Asakawa, Navcog: a navigational
cognitive assistant for the blind, in: Proceedings of the 18th International Conference on
Human-Computer Interaction with Mobile Devices and Services, 2016, pp. 90–99.
[15] D. Plikynas, A. Indriulionis, A. Laukaitis, L. Sakalauskas, Indoor-guided navigation for people
who are blind: Crowdsourcing for route mapping and assistance, Applied Sciences 12 (2022).
[16] J. Madake, S. Bhatlawande, A. Solanke, S. Shilaskar, A qualitative and quantitative analysis of
research in mobility technologies for visually impaired people, IEEE Access (2023).
[17] J. Yan, S. Zlatanova, A. Diakité, A unified 3d space-based navigation model for seamless navigation
in indoor and outdoor, International Journal of Digital Earth 14 (2021) 985–1003.
[18] F. Furfari, A. Crivello, P. Barsocchi, F. Palumbo, F. Potortì, What is next for indoor localisation?
taxonomy, protocols, and patterns for advanced location based services, in: 2019 International
Conference on Indoor Positioning and Indoor Navigation (IPIN), 2019, pp. 1–8.
[19] F. E.-Z. El-Taher, A. Taha, J. Courtney, S. Mckeever, A systematic review of urban navigation
systems for visually impaired people, Sensors 21 (2021).
[20] M. Miao, M. Spindler, G. Weber, Requirements of indoor navigation system from blind users, in:
Information Quality in e-Health: 7th Conference of the Workgroup Human-Computer Interaction
and Usability Engineering of the Austrian Computer Society, USAB 2011, Graz, Austria, November
25-26, 2011. Proceedings 7, Springer, 2011, pp. 673–679.
[21] M. Wang, A. Dommes, V. Renaudin, N. Zhu, Analysis of spatial landmarks for seamless urban
navigation of visually impaired people, IEEE Journal of Indoor and Seamless Positioning and
Navigation 1 (2023) 93–103.
[22] B. Chaudary, S. Pohjolainen, S. Aziz, L. Arhippainen, P. Pulli, Teleguidance-based remote navigation
assistance for visually impaired and blind people—usability and user experience, Virtual Reality
27 (2023) 141–158.
[23] Z. Chen, X. Liu, M. Kojima, Q. Huang, T. Arai, A wearable navigation device for visually impaired
people based on the real-time semantic visual slam system, Sensors 21 (2021).
[24] P. Theodorou, K. Tsiligkos, A. Meliones, C. Filios, An extended usability and ux evaluation of
a mobile application for the navigation of individuals with blindness and visual impairments
outdoors—an evaluation framework based on training, Sensors 22 (2022).
[25] M. Afif, R. Ayachi, E. Pissaloux, Y. Said, M. Atri, Indoor objects detection and recognition for an
ict mobility assistance of visually impaired people, Multimedia Tools and Applications 79 (2020).
[26] V. Nair, G. Olmschenk, W. H. Seiple, Z. Zhu, Assist: Evaluating the usability and performance of
an indoor navigation assistant for blind and visually impaired people, Assistive Technology 34
(2022) 289–299.
[27] K. Müller, C. Engel, C. Loitsch, R. Stiefelhagen, G. Weber, Traveling more independently: a study
on the diverse needs and challenges of people with visual or mobility impairments in unfamiliar
indoor environments, ACM Transactions on Accessible Computing (TACCESS) (2022) 1–44.
[28] M. Matsuo, T. Miura, R. Ichikari, K. Kato, T. Kurata, Tracing interaction on otasce map by the
visually impaired: Feasibility of adopting interactive route guidance, in: 2022 IEEE International
Conference on Systems, Man, and Cybernetics (SMC), IEEE, 2022, pp. 1548–1553.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>M.</given-names>
            <surname>Hegarty</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Waller</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Shah</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Miyake</surname>
          </string-name>
          , The cambridge handbook of visuospatial thinking,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>M. T.</given-names>
            <surname>Paratore</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Leporini</surname>
          </string-name>
          ,
          <article-title>Exploiting the haptic and audio channels to improve orientation and mobility apps for the visually impaired</article-title>
          ,
          <source>Universal Access in the Information Society</source>
          (
          <year>2023</year>
          )
          <fpage>1</fpage>
          -
          <lpage>11</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M.</given-names>
            <surname>Ballantyne</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Jha</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Jacobsen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. S.</given-names>
            <surname>Hawker</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y. N.</given-names>
            <surname>El-Glaly</surname>
          </string-name>
          ,
          <article-title>Study of accessibility guidelines of mobile applications</article-title>
          ,
          <source>in: Proceedings of the 17th international conference on mobile and ubiquitous multimedia</source>
          ,
          <year>2018</year>
          , pp.
          <fpage>305</fpage>
          -
          <lpage>315</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>S.</given-names>
            <surname>Real</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Araujo</surname>
          </string-name>
          ,
          <article-title>Navigation systems for the blind and visually impaired: Past work, challenges, and open problems</article-title>
          ,
          <source>Sensors</source>
          <volume>19</volume>
          (
          <year>2019</year>
          )
          <fpage>3404</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A.</given-names>
            <surname>Khan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Khusro</surname>
          </string-name>
          ,
          <article-title>An insight into smartphone-based assistive solutions for visually impaired and blind people: issues, challenges and opportunities</article-title>
          ,
          <source>Universal Access in the Information Society</source>
          <volume>20</volume>
          (
          <year>2021</year>
          )
          <fpage>265</fpage>
          -
          <lpage>298</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>A. S.</given-names>
            <surname>Martinez-Sala</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Losilla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Sánchez-Aarnoutse</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>García-Haro</surname>
          </string-name>
          ,
          <article-title>Design, implementation and evaluation of an indoor navigation system for visually impaired people</article-title>
          ,
          <source>Sensors</source>
          <volume>15</volume>
          (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S.</given-names>
            <surname>Subedi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-Y.</given-names>
            <surname>Pyun</surname>
          </string-name>
          ,
          <article-title>A survey of smartphone-based indoor positioning system using rf-based wireless technologies</article-title>
          ,
          <source>Sensors</source>
          <volume>20</volume>
          (
          <year>2020</year>
          )
          <fpage>7230</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>K.</given-names>
            <surname>Jivrajani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. K.</given-names>
            <surname>Patel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Parmar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Surve</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Ahmed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. M.</given-names>
            <surname>Bui</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. A.</given-names>
            <surname>Al-Zahrani</surname>
          </string-name>
          ,
          <article-title>Aiot-based smart stick for visually impaired person</article-title>
          ,
          <source>IEEE Transactions on Instrumentation and Measurement</source>
          <volume>72</volume>
          (
          <year>2022</year>
          )
          <fpage>1</fpage>
          -
          <lpage>11</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>