Inclusive Navigation Systems: Perspectives and Challenges for the Visually-Impaired

Dimitri Belli1,∗,†, Paolo Barsocchi1,†, Antonino Crivello1,†, Francesco Furfari1,†, Barbara Leporini1,2,† and Maria Teresa Paratore1,∗,†

1 Institute of Information Science and Technologies (ISTI), National Research Council (CNR), Via G. Moruzzi 1, Pisa, Italy
2 Department of Computer Science, University of Pisa, Largo B. Pontecorvo 3, Pisa, Italy

Abstract
Despite significant advances in technology, the area of mobility and orientation for visually impaired persons continues to present significant challenges. Digital maps have become essential for navigation, but their usability is often compromised for users who rely on assistive technologies, especially when accessed on small touch screens. This calls for innovative approaches to making digital maps more accessible and usable, as these tools are crucial for creating mental maps of navigational spaces. This paper explores the need for inclusive localization and positioning systems that accommodate a wide range of users, including those with visual impairments. It highlights the critical role of user context, such as device experience and positional awareness, in improving the usability of these systems. The integration of haptic and audio feedback may offer promising new interaction methods, although further development is needed. In addition, user interface design and system characteristics such as security, robustness and usability need to be aligned with user acceptance, with a focus on low cost and simplicity. Our analysis identifies key requirements for the design of inclusive systems and proposes steps for the scientific community to take to advance the field, with the aim of bridging the gap between technological capabilities and practical usability, and promoting inclusive design principles for future innovation.
Keywords
Inclusivity, Indoor localization, Orientation and mobility, Accessibility, Blind and visually impaired, Navigation aids

Proceedings of the Work-in-Progress Papers at the 14th International Conference on Indoor Positioning and Indoor Navigation (IPIN-WiP 2024)
∗ Corresponding authors.
† These authors contributed equally.
dimitri.belli@isti.cnr.it (D. Belli); paolo.barsocchi@isti.cnr.it (P. Barsocchi); antonino.crivello@isti.cnr.it (A. Crivello); francesco.furfari@isti.cnr.it (F. Furfari); barbara.leporini@di.unipi.it (B. Leporini); mariateresa.paratore@isti.cnr.it (M. T. Paratore)
ORCID: 0000-0003-1491-6450 (D. Belli); 0000-0002-6862-7593 (P. Barsocchi); 0000-0001-7238-2181 (A. Crivello); 0000-0002-4957-828X (F. Furfari); 0000-0003-2469-9648 (B. Leporini); 0000-0002-9089-8445 (M. T. Paratore)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

1. Introduction

The terms "orientation" and "mobility" are related to the concepts of wayfinding and locomotion, respectively, and identify the fundamental components of spatial decision making [1]. Good orientation and mobility skills are essential for a visually impaired person who moves independently. Orientation refers to the skill of planning a route from the current position to a given destination; it therefore requires knowledge of the whole area and its relevant landmarks, organized in a mental representation of the environment (a.k.a. "cognitive map") [2]. Mobility, on the other hand, involves all the actions that an individual takes in direct response to the physical characteristics of the environment as he or she traverses it (e.g. avoiding an obstacle, descending or ascending stairs), and thus depends only on the perception of the immediate environment at a given moment.

Knowing in advance the structure of a new indoor or outdoor environment and where its points of interest are located can help visually impaired visitors find their way around and quickly reach their destination. Tools such as tactile maps, 3D physical models and accurate text descriptions help users to build a cognitive map of the space to be explored. A preliminary physical exploration, either with the help of a person or relying solely on the traditional white cane, is also a valuable aid.

Nowadays, most public spaces provide visitors with navigation tools, such as digital signs, websites, or mobile applications. Unfortunately, these tools often pose accessibility problems for visitors with special needs. For example, digital maps integrated into mobile applications typically lack features to support accessibility for persons with visual impairments [3]; yet, accessible navigation mobile apps have proven to be effective assistive solutions for this category of user [4]. The authors of [5] discuss the daily challenges faced by visually impaired persons and point out how ICT, and more specifically assistive technologies, can help them achieve greater social inclusion and autonomy.

Designing an application for visually impaired persons requires special hardware infrastructure and positioning techniques, adapted to the characteristics of the environment (e.g. wall thickness and geometry) and the specific needs of the users [6]. Commonly adopted infrastructures include Wi-Fi, Radio-frequency identification (RFID), Bluetooth Low Energy (BLE)1, Long Range (LoRa) communication, and Ultra Wide Band (UWB), sometimes combined in hybrid systems [7]. Recently, the growing popularity of AI algorithms has led to their integration into several assistive solutions in the field23. In particular, deep-learning algorithms for image recognition have been used to empower the traditional white cane [8] to facilitate object recognition and obstacle avoidance, or integrated into mobile apps to simplify wayfinding and orientation tasks4.
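To make the infrastructure side concrete, the following is a minimal sketch of how room-level positioning can be derived from BLE beacon readings, combining a log-distance path-loss model with a nearest-beacon heuristic. The beacon identifiers, room names and radio parameters are illustrative assumptions, not taken from any of the cited systems.

```python
# Minimal sketch: room-level positioning from BLE beacon RSSI readings.
# Beacon-to-room mapping and radio parameters are illustrative assumptions.

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Estimate distance (in meters) via the log-distance path-loss model."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def nearest_room(readings, beacon_rooms):
    """Return the room of the beacon with the smallest estimated distance."""
    best_id, _ = min(readings, key=lambda r: rssi_to_distance(r[1]))
    return beacon_rooms[best_id]

beacon_rooms = {"b1": "Entrance hall", "b2": "Elevator lobby", "b3": "Room 12"}
readings = [("b1", -78), ("b2", -61), ("b3", -85)]  # (beacon_id, RSSI in dBm)
print(nearest_room(readings, beacon_rooms))  # strongest beacon -> Elevator lobby
```

In practice such a heuristic would be smoothed over time and fused with other signals, but it illustrates why room-level granularity is a natural target for BLE-based assistive systems.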
Impairments can pose challenges to the accessibility of human-machine interfaces, potentially impacting the overall user experience [5]. In a mobile application, user interfaces need to be carefully designed to prevent accessibility issues; for instance, avoiding pop-ups is critical when creating GUIs for visually impaired users. Personalization of information (i.e., choosing which information to present and how to deliver it) is another aspect that must be considered. Visually impaired persons commonly use non-visual channels to perceive their surroundings [9]. While exploring an environment, blind subjects usually rely on the sounds they hear and the obstacles they discover using the white cane, in order to identify and remember landmarks and to better orient themselves [2, 10]. Other signals, such as smells, can also be associated with landmarks and may become valuable aids for orientation. Moreover, blind persons are not inclined to rely on assistive technology exclusively, as they are unwilling to give up the traditional white cane, which allows them to move around more safely. Partially sighted individuals, on the other hand, may have a different approach, as they can make use of their residual vision. Besides these general considerations, each individual has their own unique preferences, influenced by factors like age, gender and personal experiences [11]. For example, not all visually impaired subjects are equally responsive to the sense of touch, depending on their age or familiarity with tactile devices.

For an orientation and mobility system to be effective, it is not only necessary to have a robust software and hardware infrastructure, but also a properly designed user experience for the related software applications and devices. To this end, the adoption of co-design strategies in a preliminary phase may prove fundamental [12, 13].
As the performance of smartphones has increased, they have gained popularity as tools to enhance the independence of individuals with visual impairments. Mobile applications also benefit from the large distribution capacity provided by app stores on the Internet. Thanks to features such as vibration motors, text-to-speech (TTS) and accessibility services, smartphones offer cost-effective solutions for providing information through the tactile and auditory channels [14]. However, in many cases, audio alerts can be intrusive or difficult to hear in noisy environments. In addition, the vibrations provided by a smartphone may not be sufficient to guide a user through a complex physical environment, and visually impaired users may find it cumbersome to hold a smartphone in one hand and a guide dog or white cane in the other.

There are many assistive devices designed to overcome these problems, with software and hardware tailored to the computational demands of traversal and localization problems. Some of these devices are designed as stand-alone modules, while others can be integrated with mobile apps. Their rather high cost is, however, a limiting factor to their diffusion. Examples of such devices include smart glasses and other wearables based on the haptic channel. Assistive smart glasses exist which may also come equipped with bone conduction headphones to reduce the burden of auditory stimuli, or may be integrated with software libraries for augmented reality or artificial intelligence5. Assistive wearable belts are another type of hands-free device, consisting of a series of actuators6, which are capable of transmitting fine-grained directional details.

1 The official iBeacon documentation: https://developer.apple.com/ibeacon/
2 Seeing AI: https://www.seeingai.com/
3 Be my AI: https://www.bemyeyes.com/blog/introducing-be-my-ai
4 VoiceVista: https://www.applevis.com/apps/ios/navigation/voicevista
This work highlights several key issues related to the development of assistive navigation applications for visually impaired persons, issues that should receive more attention from the scientific community. The articles we have identified and discussed have been carefully selected to underscore specific aspects such as user requirements, user experience, usability, and accessibility of assistive navigation devices. While this work does not aim to be an exhaustive or systematic review of the literature on the subject, it serves as a compass: a preliminary effort designed to guide and inspire future, in-depth studies that will expand our understanding and improve the development of these technologies.

The remainder of the paper is organized as follows: Section 2 outlines the key requirements for defining a usability-centered and inclusive assistive navigation solution. Section 3 connects these requirements to the context of indoor localization. Finally, Sections 4 and 5 respectively summarize the main open challenges in this area and draw conclusions.

2. Understanding user needs: Requirements for an inclusive experience

Designing an assistive device for users with special needs implies focusing on their expectations, in order to develop hardware and software solutions that provide an effective and satisfactory user experience. In [15], the authors conducted a survey involving twenty-five blind users and highlighted several important observations: blind persons prefer interactive tactile maps to navigate towards distant destinations or specific nearby objects, such as elevators, stairs, entrances, room numbers, and exits. Additionally, the survey found that blind users favor sound-guided navigation outdoors and tactile navigation indoors. Another notable outcome of this survey is that blind individuals often face challenges in locating the correct room and finding elevators or stairs indoors.
The latter point is particularly important for applications designed to help visually impaired individuals move independently [16]. Key specifications for these applications include:

1. Minimizing the information load provided to the user through feedback, using the auditory and tactile channels with the least intrusiveness possible, ideally considering hands-free and ears-free solutions.
2. Ensuring that the device is lightweight and easy to wear and transport, without restricting the user's freedom of movement.
3. Balancing the complexity of the system with a long battery life to ensure effective operation over a long period.
4. Designing intuitive interfaces that require a short learning curve, to minimize the effort necessary to understand and use the device.
5. Ensuring safety, security, and reliability, so as to establish a sense of trustworthiness in the user, rather than anxiety towards technology. For this reason, privacy and security aspects related to sensitive information must be considered thoroughly.

The literature commonly distinguishes between outdoor and indoor navigation systems, each addressing specific and distinct user requirements. While solutions for seamless navigation between these environments do exist, they are still in the early stages, with many challenges yet to be addressed and little work specifically devoted to the topic [17, 18]. As reported in [19], outdoor urban navigation systems typically involve three main tasks: environmental mapping, route planning, and real-time navigation. User requirements for these tasks vary.
5 IrisVision: https://irisvision.com/electronic-glasses-for-the-blind-and-visually-impaired/
6 FeelSpace: https://feelspace.de/en/

Figure 1: An overview of the main user requirements for indoor and outdoor navigation solutions developed for assisting blind and visually impaired persons. (Requirements shown include: compass orientation, route planning, haptic and audio feedback, pedestrian routing, building mapping, public transportation routing, pedestrian traffic light localization, door and window detection, sidewalk detection, staircase and lift/elevator localization, crosswalk and zebra detection, obstacle avoidance, localization, multiple input channels (haptic and audio), and multi-device support (portability).)

For environmental mapping, they include identifying road junctions, traffic lights, pavements, zebra crossings, and transport options. For route planning, they include selecting locations and routes, and choosing the type of path (pedestrian or public transport). For real-time navigation, requirements range from understanding the environment through contextual information (e.g., distance to destination, travel time and so forth) to identifying obstacles. User requirements for indoor navigation systems for blind and visually impaired persons, instead, can be categorized into functional and usability requirements, and route description requirements [20]. Concerning functional and usability requirements, blind and visually impaired users prioritize a navigation support system with functionalities like route planning across devices (mobile and desktop), saving destinations for later use, and receiving current location information with detailed descriptions of points of interest and environment facilities like stairs, elevators, toilets, coffee and snack vending machines.
They also require clear feedback on inputs, advanced obstacle warnings (tactile and audible), route retracement options, wrong-turn alerts, and the ability to store and share points of interest. As for route descriptions, personalization, concise summaries, functional waypoint markers, and the integration of environmental cues (tactile, auditory, olfactory) are crucial. While mobile devices are preferred for navigation, some users value desktop planning for its ease of information input. Fig. 1 summarizes the main user requirements presented above for indoor and outdoor navigation solutions developed for assisting blind and visually impaired persons. Although addressing both outdoor and indoor environments, this work concentrates specifically on user requirements for indoor localization.

3. Translating user requirements into the indoor localization context

To deploy an inclusive localization system by design, we need to address all the requirements outlined in the previous section. Our proposal for an inclusive indoor localization system focuses on maximizing the accuracy of the user's location at the room level and precisely identifying transition areas such as elevators and stairs, using interactive and tactile maps rather than relying solely on sound-guided navigation. Our vision emphasizes the importance of identifying landmarks as easily recognizable objects that serve as external reference points. Elevators and stairs are primary examples of these landmarks, but other objects defined within a tactile map can provide valuable information throughout the wayfinding and navigation process. This approach is consistent with recent findings in [21], which identified a set of landmarks for outdoor scenarios. Consequently, our maps are designed to help users locate landmarks and rooms through tactile means. By integrating these features, our proposal for a localization system aims to provide a highly accurate and user-friendly solution.
This ensures that visually impaired and blind persons can navigate indoor environments with greater independence and confidence. Importantly, this approach addresses the privacy awareness and trustworthiness requirement by providing a system that is not only reliable and secure, but also fosters trust in the technology. The use of tactile maps enhances the user experience by making navigation intuitive and reducing the anxiety related to technology use. Moreover, the design of the system prioritizes privacy and the secure handling of sensitive information, further contributing to a safe and trustworthy navigation aid. We propose that the inclusive maps should be primarily tactile, produced for key areas to allow blind users to feel their way around the map. These maps should be enriched to work seamlessly with screen readers, providing audio descriptions of landmarks where appropriate, while prioritizing the tactile cues. Users can receive spoken instructions and descriptions of their surroundings to help them navigate. The color schemes used for the digital maps should emphasize high contrast to ensure that users with low vision can distinguish between different elements. Important text and symbols should be displayed in larger fonts to improve readability. Map layout should be kept deliberately simple, focusing on essential routes and landmarks. Non-essential details should be minimized to avoid confusion and ensure that users can quickly understand and use the map. The system should also include interactive features, such as customizable haptic cues issued from the touch screen, allowing users to interact with the map in a way that best suits their needs and preferences. All of these proposed features must be implemented in accordance with the unobtrusive feedback requirement, which emphasizes minimizing the information load provided to the user through feedback. 
By using tactile and auditory channels with the least possible intrusiveness, the system ensures hands-free and ears-free solutions wherever possible. This approach minimizes cognitive overload and allows users to navigate intuitively and efficiently without being overwhelmed by excessive information. In Table 1, we report significant works in outdoor and indoor environments, highlighting whether user needs are addressed (✓) or not considered (empty cell). Below we provide more detail on the works analyzed, describing strengths and possible lessons learned for achieving inclusive indoor systems.

In [22], the development of an integrated system is described, in which a white cane is equipped with two actuators in its handle to provide directional hints. The authors report findings from a usability and user experience study conducted with blind and visually impaired persons, showing positive perceptions among participants and highlighting a preference for haptic guidance with two actuators; conversely, partially sighted participants preferred the single-actuator method. These conclusions underline the importance of vibration feedback for blind and visually impaired individuals as a useful tool for improving orientation and providing feedback and input on user mobility, in line with the respective orientation and mobility requirement. It is therefore essential to develop a system that is not only lightweight and easy to wear and carry, thus satisfying the ergonomic portability requirement, but also successfully balances system complexity with long battery life. By using lightweight materials and an ergonomic design, the device will ensure the user's freedom of movement. In addition, the use of low-power technologies and software optimizations allows the system to operate effectively for extended periods without the need for frequent recharging. Similar findings are provided in [23].
The authors propose a real-time system for blind persons which exploits a depth camera to provide feedback through speakers and vibrations from a smartwatch. Here it is important to emphasise the need for near real-time capabilities in real-world applications. This is particularly critical in indoor environments, given the number of meters that can be walked in a short period of time and the potential hazards posed by walls, doors, stairs and furniture. Unfortunately, their system relies on a camera placed above the user's head; while in theory such solutions can work in a real environment, in this particular use case we should consider, and strongly reaffirm, the importance of providing less invasive solutions for real users. This is a key requirement for user acceptance and widespread adoption in the real world. Meeting the ergonomic portability requirement, the inclusive indoor localization system must be lightweight and easy to wear and transport, ensuring that it does not restrict the user's freedom of movement. The use of a head-mounted camera can be perceived as cumbersome and intrusive, limiting the user's comfort and willingness to use the device consistently.

Table 1: User requirements in representative indoor and outdoor solutions (✓ = requirement addressed, empty cell = not considered). Requirement columns: Feedback optimization; Ergonomic portability; Effortless interaction; Orientation and Mobility; Privacy awareness and trustworthiness.

[21]  O     ✓ ✓
[22]  I&O   ✓ ✓ ✓
[23]  I     ✓
[24]  O     ✓ ✓ ✓
[2]   O     ✓ ✓ ✓
[25]  I
[26]  I     ✓ ✓
[27]  I     ✓
[28]  O     ✓ ✓ ✓

In [24], a mobile-based wayfinding tool is presented to address the challenges faced by visually impaired persons in navigating independently in urban environments. The system consists of a software application for Android devices that communicates with several external components.
These include a high-precision global positioning module that continuously monitors the user's movement, a special device that attaches to traffic lights to determine their current state, and an ultrasonic detection unit to identify nearby obstacles in the pedestrian's path. This comprehensive solution aims to increase the autonomy and safety of visually impaired persons as they navigate public spaces without relying on human assistance. It is worth emphasizing that more mature solutions, in terms of inclusivity, have been developed for outdoor environments, and this knowledge of outdoor navigation should be applied to the design of indoor systems. Indoor environments present unique challenges where, for example, precision and accuracy may be paramount in some areas and less critical in others. Similarly, the concentration of potential hazards may be more pronounced in certain areas, such as near staircases, than in others. This variability in environmental characteristics requires a nuanced approach to the design of indoor navigation systems, one that incorporates lessons learned from outdoor solutions while addressing the specific challenges of indoor environments.

In [2], instead, the authors propose a low-cost solution for the construction of cognitive maps for visually impaired users in urban environments, exploiting the TTS and vibration capabilities offered by the Android operating system. Although the proposed prototype is based on GPS coordinates in outdoor environments, the adoption of the GeoJSON format (a geospatial data interchange format based on JSON) for the maps entails no loss of generality, as the same format can also be used to describe indoor contexts. The customization of GeoJSON metadata to provide accessible feedback tailored to the users' needs is described, along with other technical solutions adopted to support accessibility.
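As an illustration of this kind of metadata enrichment, the snippet below builds a GeoJSON Feature whose properties carry accessibility hints. The property names (tts_description, vibration_pattern, landmark_type) are hypothetical, chosen for illustration rather than taken from the prototype in [2].

```python
import json

# Hypothetical GeoJSON feature for an indoor landmark, enriched with
# accessibility metadata (property names are illustrative only).
landmark = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [10.4239, 43.7185]},
    "properties": {
        "name": "Main staircase",
        "landmark_type": "stairs",
        "tts_description": "Staircase to the first floor, handrail on the right.",
        "vibration_pattern": [100, 50, 100],  # ms on/off/on when approaching
    },
}

# Round-trip through JSON text, as a map file would be stored and parsed;
# an app could feed tts_description to a screen reader and vibration_pattern
# to the device's vibration motor.
parsed = json.loads(json.dumps(landmark))
print(parsed["properties"]["tts_description"])
```

Because GeoJSON tolerates arbitrary extra properties, such annotations can be added without breaking standard map tooling, which is what makes the format attractive for both outdoor and indoor accessible maps.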
The importance of using a co-design strategy is emphasised by describing how target users were actively involved during all stages of design and development, by means of interviews and prototype testing. Gathering feedback from these users proved to be extremely important both for defining application functionalities and for learning the most suitable interaction modes according to specific individual needs.

In [25, 26, 27], the authors propose portable and wearable solutions for detecting obstacles and providing navigation directions. Most of the systems mainly exploit the haptic channel so as not to overload the auditory channel. Nonetheless, a thorough study is required to determine how much encoded information a system can (a) deliver to the user and (b) acquire from the user, in an intuitive and effective manner. Furthermore, while the auditory channel must be left unobstructed, its purposeful and accurate use deserves further investigation, so that systems can offer more information based on the user's context and preferences. A good combination of user localization strategies and knowledge of user preferences and usage patterns, together with system customization features, can open up new user experience scenarios in navigation tasks. The system should be able to detect the context and, at the same time, allow users to receive the information of interest through their preferred feedback, with a level of granularity driven not only by explicit requests but also by their experience with the system. In some cases a haptic signal is more immediate and efficient, while in others a verbalization of the information allows for more precise and targeted content. Systems proposed in the literature attempt to bridge some of these issues, but we are still far from effective solutions.
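One way to frame this encoding question is to cap the vocabulary of haptic cues and fall back to speech when a category is not encoded or when the user prefers verbal feedback. The categories and vibration patterns below are illustrative assumptions, not drawn from the cited systems.

```python
# Sketch: map landmark categories to short, distinguishable vibration
# patterns (milliseconds on/off), keeping the haptic vocabulary small
# enough to be learned quickly. Categories/patterns are illustrative.
PATTERNS = {
    "stairs":   [200, 100, 200],       # two long pulses
    "elevator": [80, 60, 80, 60, 80],  # three short pulses
    "door":     [300],                 # one long pulse
}

def cue_for(landmark_type, prefer_speech=False):
    """Return a haptic cue for encoded categories, or a spoken hint when
    the category is not encoded or the user prefers verbal feedback."""
    if not prefer_speech and landmark_type in PATTERNS:
        return ("haptic", PATTERNS[landmark_type])
    return ("speech", f"Approaching: {landmark_type}")

print(cue_for("elevator"))          # haptic pattern
print(cue_for("information desk"))  # not encoded -> spoken hint
```

Keeping the haptic vocabulary deliberately small, with speech as a fallback, is one concrete way to trade off information richness against the learnability concerns raised above.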
These systems are often unacceptable to users because they are expensive, conspicuous, or inconvenient to use, thus failing to meet requirements such as ergonomic portability, orientation and mobility, and effortless interaction. Although some low-cost solutions have been attempted, current technology does not yet achieve the desired results at a relatively low cost, nor does it offer the precision and accuracy that visually impaired persons would greatly need. As devices, sensors and applications evolve, more affordable solutions for the user should be further investigated.

In [28], the authors discussed the potential use and effectiveness of an interactive map application (i.e., the OTASCE map) to provide route guidance for visually impaired users. They presented a study in which 50 visually impaired participants successfully traced routes and reached destinations using the map. Participants could reach their destination even when they momentarily lost contact with the screen and then re-established touch interaction. After the study, the authors interviewed 24 users and received valuable feedback. Some users misinterpreted coordinates or orientation, while others found next-direction indications inaccurate or delayed. Feedback on audio and speech levels varied, with some users preferring vibration feedback for route deviations. These insights contribute to our understanding of the complexities of designing inclusive mapping solutions and, more generally, inclusive navigation solutions. Finally, although energy efficiency was not listed among the highlighted user requirements, it is a fundamental consideration for applications that make heavy use of sensors and computational resources. None of the articles examined addressed this aspect.

4. Open challenges

Designing cost-effective, user-friendly, and functional assistive technology for navigation remains challenging despite significant recent progress.
Designers still need to enhance positioning systems to offer greater accuracy and customizable interfaces, ensuring a positive experience and tailored information for users with special needs. An inclusive system for indoor navigation and orientation should at least address the following requirements:

• Precision and Accuracy: Achieving precision and accuracy in an inclusive indoor localization system is a significant challenge, especially in complex environments characterized by varying signal strength and many obstacles. It is crucial that a visually impaired user be guided correctly through the environment, taking into account unexpected obstacles, specific points of interest, such as information desks, and features such as stairs and elevators, which do not have the same relevance for other users. Technologies such as LiDAR, BLE and image classification have been employed for this purpose.

• Integration: Individuals with visual impairments mainly rely on traditional aids such as white canes or guide dogs. It is hence important to provide technologies that can work in combination with such aids, augmenting the perception of the surrounding environment without being intrusive. Too many audio or haptic stimuli may in fact hinder the overall user experience. Finding the proper cues to issue is a challenge that can be addressed with co-design strategies and extensive user testing.

• Human-Machine Interaction: To ensure a satisfying user experience, user interfaces should be carefully designed with the aid of the end users and extensively tested. The users' needs should first be gathered through interviews, surveys or focus groups. Then, users should be actively involved in identifying the most suitable strategies to convey information. Information is typically provided to visually impaired persons via the audio or haptic channels.
In mobile devices, TTS and accessibility services (such as Android's TalkBack) are extensively used to provide verbal hints, but simple sounds can also be used to identify precise events, such as the proximity of a point of interest. The haptic channel can be adopted in many different ways: as a means to guide the visually impaired user towards a certain direction (typically in wearable devices), or to signal the occurrence of an event, in which case patterns of vibrations can be used (e.g., different patterns may indicate different points of interest). Since many kinds of visual impairment exist and each user has their own peculiarities, it is important that both the information to be provided and the ways in which hints are issued be customizable according to the end user's unique preferences.

• Awareness and Trustworthiness: Security and robustness are important features which become even more relevant in systems designed for persons with special needs. Both software and hardware need to be extensively tested, and AI techniques (e.g., image classification and recognition) can be employed to avoid the disclosure of sensitive data.

5. Conclusion

This paper highlights the growing interest in developing inclusive localization and positioning systems that can be used by diverse types of users. The main challenge is to determine how this inclusivity can be achieved. While recent developments in localization systems have focused on aspects such as precision, accuracy, and privacy, now that technological solutions appear to be robust enough for market entry and mass distribution, we believe it is essential not to neglect the requirements and needs of all types of users. Our work aims to outline the key requirements for designing inclusive systems and identify the next steps that the scientific community should take to achieve this goal.
As technological solutions become more sophisticated, it is crucial to ensure that their benefits are accessible to users with different needs and abilities.

References

[1] M. Hegarty, D. Waller, P. Shah, A. Miyake, The Cambridge Handbook of Visuospatial Thinking, 2005.
[2] M. T. Paratore, B. Leporini, Exploiting the haptic and audio channels to improve orientation and mobility apps for the visually impaired, Universal Access in the Information Society (2023) 1–11.
[3] M. Ballantyne, A. Jha, A. Jacobsen, J. S. Hawker, Y. N. El-Glaly, Study of accessibility guidelines of mobile applications, in: Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, 2018, pp. 305–315.
[4] S. Real, A. Araujo, Navigation systems for the blind and visually impaired: Past work, challenges, and open problems, Sensors 19 (2019) 3404.
[5] A. Khan, S. Khusro, An insight into smartphone-based assistive solutions for visually impaired and blind people: Issues, challenges and opportunities, Universal Access in the Information Society 20 (2021) 265–298.
[6] A. S. Martinez-Sala, F. Losilla, J. C. Sánchez-Aarnoutse, J. García-Haro, Design, implementation and evaluation of an indoor navigation system for visually impaired people, Sensors 15 (2015).
[7] S. Subedi, J.-Y. Pyun, A survey of smartphone-based indoor positioning system using RF-based wireless technologies, Sensors 20 (2020) 7230.
[8] K. Jivrajani, S. K. Patel, C. Parmar, J. Surve, K. Ahmed, F. M. Bui, F. A. Al-Zahrani, AIoT-based smart stick for visually impaired person, IEEE Transactions on Instrumentation and Measurement 72 (2022) 1–11.
[9] W. R. Wiener, R. L. Welsh, B. B. Blasch, Foundations of Orientation and Mobility, volume 1, American Foundation for the Blind, 2010.
[10] C. Prandi, G. Delnevo, P. Salomoni, S. Mirri, On supporting university communities in indoor wayfinding: An inclusive design approach, Sensors 21 (2021) 3134.
[11] L. Ottink, H. Buimer, B. van Raalte, C. F. Doeller, T. M. van der Geest, R. J. van Wezel, Cognitive map formation supported by auditory, haptic, and multimodal information in persons with blindness, Neuroscience & Biobehavioral Reviews 140 (2022).
[12] D. Plikynas, A. Žvironas, A. Budrionis, M. Gudauskis, Indoor navigation systems for visually impaired persons: Mapping the features of existing technologies to user needs, Sensors 20 (2020).
[13] C. Wang, Y. Chen, S. Zheng, H. Liao, Gender and age differences in using indoor maps for wayfinding in real environments, ISPRS International Journal of Geo-Information 8 (2018).
[14] D. Ahmetovic, C. Gleason, C. Ruan, K. Kitani, H. Takagi, C. Asakawa, NavCog: A navigational cognitive assistant for the blind, in: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, 2016, pp. 90–99.
[15] D. Plikynas, A. Indriulionis, A. Laukaitis, L. Sakalauskas, Indoor-guided navigation for people who are blind: Crowdsourcing for route mapping and assistance, Applied Sciences 12 (2022).
[16] J. Madake, S. Bhatlawande, A. Solanke, S. Shilaskar, A qualitative and quantitative analysis of research in mobility technologies for visually impaired people, IEEE Access (2023).
[17] J. Yan, S. Zlatanova, A. Diakité, A unified 3D space-based navigation model for seamless navigation in indoor and outdoor, International Journal of Digital Earth 14 (2021) 985–1003.
[18] F. Furfari, A. Crivello, P. Barsocchi, F. Palumbo, F. Potortì, What is next for indoor localisation? Taxonomy, protocols, and patterns for advanced location based services, in: 2019 International Conference on Indoor Positioning and Indoor Navigation (IPIN), 2019, pp. 1–8.
[19] F. E.-Z. El-Taher, A. Taha, J. Courtney, S. Mckeever, A systematic review of urban navigation systems for visually impaired people, Sensors 21 (2021).
[20] M. Miao, M. Spindler, G. Weber, Requirements of indoor navigation system from blind users, in: Information Quality in e-Health: 7th Conference of the Workgroup Human-Computer Interaction and Usability Engineering of the Austrian Computer Society, USAB 2011, Graz, Austria, November 25-26, 2011, Proceedings 7, Springer, 2011, pp. 673–679.
[21] M. Wang, A. Dommes, V. Renaudin, N. Zhu, Analysis of spatial landmarks for seamless urban navigation of visually impaired people, IEEE Journal of Indoor and Seamless Positioning and Navigation 1 (2023) 93–103.
[22] B. Chaudary, S. Pohjolainen, S. Aziz, L. Arhippainen, P. Pulli, Teleguidance-based remote navigation assistance for visually impaired and blind people—usability and user experience, Virtual Reality 27 (2023) 141–158.
[23] Z. Chen, X. Liu, M. Kojima, Q. Huang, T. Arai, A wearable navigation device for visually impaired people based on the real-time semantic visual SLAM system, Sensors 21 (2021).
[24] P. Theodorou, K. Tsiligkos, A. Meliones, C. Filios, An extended usability and UX evaluation of a mobile application for the navigation of individuals with blindness and visual impairments outdoors—an evaluation framework based on training, Sensors 22 (2022).
[25] M. Afif, R. Ayachi, E. Pissaloux, Y. Said, M. Atri, Indoor objects detection and recognition for an ICT mobility assistance of visually impaired people, Multimedia Tools and Applications 79 (2020).
[26] V. Nair, G. Olmschenk, W. H. Seiple, Z. Zhu, ASSIST: Evaluating the usability and performance of an indoor navigation assistant for blind and visually impaired people, Assistive Technology 34 (2022) 289–299.
[27] K. Müller, C. Engel, C. Loitsch, R. Stiefelhagen, G. Weber, Traveling more independently: A study on the diverse needs and challenges of people with visual or mobility impairments in unfamiliar indoor environments, ACM Transactions on Accessible Computing (TACCESS) (2022) 1–44.
[28] M. Matsuo, T. Miura, R. Ichikari, K. Kato, T. Kurata, Tracing interaction on Otasce map by the visually impaired: Feasibility of adopting interactive route guidance, in: 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), IEEE, 2022, pp. 1548–1553.