Augmented Reality Based Technology and Scenarios for Route Planning and Visualization

Olga Pavlova (a), Andriy Bashta (a), Sofiia Kravchuk (a), Yaroslav Hnatchuk (a) and Houda El Bouhissi (b)

(a) Khmelnytskyi National University, Institutska str., 11, Khmelnytskyi, 29016, Ukraine
(b) LIMED Laboratory, Faculty of Exact Sciences, University of Bejaia, 06000, Bejaia, Algeria

Abstract
Navigation has always been an object of interest to scientists and business representatives. There are plenty of ready-to-use applications that use GPS data, designed to make users' navigation easier. Augmented Reality is currently one of the most popular emerging technologies, most commonly known for its use in games and advertising. By combining navigation with augmented reality, it is possible to obtain new user-friendly applications that can quickly help users with everyday navigation. In this study, the applied aspects of developing an information system for route planning and visualization of routes in augmented reality are considered. The relevance of the study is confirmed by the results of a survey among first-year students of Khmelnytskyi National University (KhNU) on the need for navigation assistance during the first year of study. An information system in the form of a mobile application has been developed, which provides assistance in real-time routing and in reproducing saved routes using augmented reality technology.

Keywords
Smart campus, Augmented Reality (AR), Navigation systems, iOS, Mobile application

IntelITSIS'2022: 3rd International Workshop on Intelligent Information Technologies and Systems of Information Security, March 23-25, 2022, Khmelnytskyi, Ukraine
EMAIL: olya1607pavlova@gmail.com (O. Pavlova); andreybashta@gmail.com (A. Bashta); sofiya.kravchuk02@gmail.com (S. Kravchuk); hnatchuk_ya@ukr.net (Y. Hnatchuk); houda.elbouhissi@gmail.com (H. El Bouhissi)
ORCID: 0000-0003-2905-0215 (O. Pavlova); 0000-0002-0775-1347 (A. Bashta); 0000-0003-4472-592X (S. Kravchuk); 0000-0001-9819-5069 (Y. Hnatchuk); 0000-0003-3239-8255 (H. El Bouhissi)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org)

1. Introduction

The concept of smart cities has become quite widespread over the last decade. It includes facilities for dwellers such as smart bus stops, smart parking, inclusive access to city buildings and smart navigation [1]. A currently relevant issue is navigating unfamiliar places over short distances, such as a hospital area with many buildings or a university campus with many different buildings (dormitories, educational buildings, a library, etc.), where GPS navigators do not always give accurate data [21]. Nowadays, students care about the digitalization and sustainability of their university campuses, work on green-tech projects, and want to see their university modern and technological.

The territory of Khmelnytskyi National University is quite large and occupies 81,602.95 square meters. The institution admits more than 500 new first-year students and about 30 foreign students annually. A survey among the students has been conducted; 78 Ukrainian and 34 foreign students took part in it. The results of the survey are shown in Figure 1.
Figure 1: Results of the survey among the first-year students of Khmelnytskyi National University

According to the survey, 89.5% of Ukrainian first-year students and 100% of foreign first-year students consider the campus of Khmelnytskyi National University to be large. 63.2% of Ukrainian first-year students and 71.4% of foreign first-year students responded that they needed some help with navigation (finding the necessary academic buildings, hostel, sports building, etc.) at first. 78.9% of Ukrainian first-year students and 85.7% of foreign first-year students responded that, in their opinion, freshmen need some help navigating the campus of KhNU. 84.2% of Ukrainian first-year students and 85.7% of foreign first-year students consider that it would be relevant and useful to create an augmented reality-based application for laying the most popular routes on the campus of KhNU to facilitate the navigation of first-year students and their parents during their first stay on the territory of KhNU.

Considering the relevance of this issue, it was decided to develop the information system for route planning and visualization in the form of a mobile application. Augmented reality technologies and GPS data have been chosen for the development. Therefore, the aims of this work are:
1) to conduct an analysis of modern technologies for navigation using augmented reality and GPS data;
2) to develop the client part in the form of a mobile app that provides planning and visualization of routes;
3) to conduct an experiment on route planning and visualization on the Khmelnytskyi National University campus.

2. Domain Analysis

Augmented Reality is an area that is closely related to Virtual Reality in that both utilize a subset of the same tools but for different purposes [4], [5]. Both technologies are subsets of mixed reality (MR), which uses various techniques, for instance mobile devices, head-mounted displays (HMD), projection and movement tracking systems [3]. VR creates a computer-generated virtual environment that can be interacted with at any time. AR, on the other hand, takes the real world through a camera of some kind and allows the user to interact with it by placing virtual objects such as images, 3D objects and audio in real time. In some applications AR is also based on the positioning of the device, allowing large amounts of data to be abstracted, as only nearby data are shown.

The three main characteristics that define Augmented Reality [4] are the following:
- AR combines real and virtual information.
- AR is interactive in real time.
- AR operates and is used in a 3D environment.

An analysis of sources (Table 1) has been conducted to learn the state of the art in the Augmented Reality domain.

Table 1: Literature review of the Augmented Reality domain for navigation

Source | Purpose | Design | Method of research | Results | Conclusion
Bernelind, S. (2015) [6] | An investigative study of whether AR can be used for improving navigation | Master's degree thesis | A qualitative set of interviews | A large set of data on the parameters described in [6] | AR is not necessarily better when walking compared to Google Maps, but when driving it has potential
Larsson, M. (2018) [2] | An investigative and comparative study of whether AR can be used to improve locating city data by visualizing it instead of using 2D means | Master's degree thesis | Quantitative and qualitative study; development of an AR-based mobile app for Android OS | A study based on the results of the survey | AR could be used to improve finding nearby city services, but not in its current state
Irshad, S., & Rambli, D. R. A. (2014) [7] | An effort to summarize the current research regarding UX of MAR | A summarizing study | Literature study | A central gap in the literature has been identified on how to design for quality experience of users in MAR | It is evident from the number of findings that there is a need to address UX-related issues like UX evaluation in MAR
Kipper, G. (2013) [4] | A summarization of what AR is, what it can be used for and its potential for the future | A summarizing study | Literature study | A comprehensive summary of AR | AR is just starting to break out of its infancy
Lando, E. (2017) [9] | Aid the development and design of AR applications in museum settings | An investigative study | A quantitative/qualitative set of interviews | Vague quantitative results and positive qualitative results | The qualitative data suggest that in-world spaces have more aspects that would benefit learning

As can be seen from Table 1, the review of the literature and academic studies in the AR domain gives merely a theoretical background. Surveys of the research literature suggest that AR currently greatly increases the driver's attention. Research in the AR and VR domain is moving fast, and several institutes and companies like Tesla, BMW, Pioneer and Toyota are working on developments in AR navigation [12]. Therefore, it was decided to make a review of the practical use of augmented reality for navigation (Table 2). Table 2 shows that AR is used not only for built-in navigation systems in driverless cars, but also for custom use in the form of mobile applications. The analysis has shown that AR combined with Android OS is the most frequently used combination for mobile development. Therefore, it was decided to use the following combination of technologies: the iOS operating system, the Swift programming language and the ARKit augmented reality library, since there are not so many existing iOS-based mobile applications of this kind and there is still a need for research and development.

Table 2: Review of currently known mobile applications using Augmented Reality for user navigation

Name | Free or paid | Description
Gatwick Airport [13] | Free | Android- and iOS-based mobile application for indoor navigation through Gatwick Airport
AR GPS drive/walk navigation [14] | Free | Android-based application that uses the smartphone's GPS and camera to implement an augmented reality-powered car navigation system
Real LiveMaps AR | Free, in-app purchases | Android-based live maps for navigation; used on trips or while traveling. Search makes it easy to plan trips and itineraries
Augmented Reality extension for Locus Map [15] | The add-on is in BETA version and works with the Locus Map Pro application; in Locus Map Free, its usage is limited to 1 minute | Add-on for the Android-based Locus Map application that enables visualization of selected points on the device screen in the camera view, in augmented reality. Useful during town sightseeing tours, on view towers, for geocaching or for simple guidance to any point
3. Augmented reality-based technology for route planning and visualization

As can be seen from Figure 2, the AR-based system uses a real-time video stream as input data. After the video is processed and augmented reality elements are added, with the involvement of the user and the environment, the route is built and saved in the application. This route can be reproduced again once a user needs it, or transferred to another user and opened with the same application. Reproducing the ready-built route, which is the output data of the system, also requires user and environment engagement.

To automate the AR-based technology on which the mobile application is based, it is necessary to decompose the system. The structure of the technology is shown in Figure 3. The technology requires the camera of the smartphone to be enabled. First, iOS requests camera authorization from the user and shows the "Allow" button; this permission is checked every time the user opens the application. Then, the CoreLocation framework is used to get the user location from the iOS system. After that, the recording can be started: the camera session starts and the camera preview layer is displayed on the screen. The user has one option, tapping the "Start" button, after which a 3-2-1 timer is displayed and the user can start walking to any point and record their way. The entire route is recorded using small 3D objects, SCNNode objects, which are part of the SceneKit framework [17].

Figure 2: Functional block diagram of the mobile application for real-time routing based on AR technology

Then all these nodes are connected into one UIBezierPath object. All the information is converted into a container that supports NSSecureCoding, which can be saved to the device memory. All routes can be saved and reproduced when needed: the user can replay any recorded route from point to point in real time, using AR-based data on the device location and orientation in the camera view.

Figure 3: Structural diagram of the AR-based technology for route planning and visualization

To work with augmented reality technology using the ARKit library, it should be noted that during the experiment (laying out a 3D route) the phone will be in three-dimensional space (Figure 4), so several aspects need to be considered. The coordinate system used in ARKit is a right-handed coordinate system. Determining the coordinates of an augmented reality object requires bringing together three coordinate systems - the world coordinate system, the object coordinate system and the camera coordinate system - at one point [19]. The world coordinate system is determined when the user's device recognizes the surrounding space to start the session. An object that is placed in augmented reality space is given an absolute position in this coordinate system. In the object coordinate system, in which the visualization is performed, the object must be placed into the world coordinate system while keeping its absolute position. The coordinate system of the camera must match the coordinate system of the object. After ARKit is launched, the user moves the smartphone in space; at this time the camera transform changes relative to the world coordinate system, and the object is then expressed in the coordinate system of the camera [19].
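To make this last step concrete, the sketch below shows how a recorded node's world-space position can be re-expressed in the camera's coordinate system for the current frame, using the camera transform that ARKit provides. It is only an illustrative fragment; the function and variable names (positionInCameraSpace, routeNode) are assumptions and are not taken from the application's source code.

```swift
import ARKit
import simd

// Illustrative sketch: express a placed route marker's world-space position
// in the camera's coordinate system for the current ARFrame.
func positionInCameraSpace(of routeNode: SCNNode, in frame: ARFrame) -> simd_float3 {
    // ARKit supplies the camera-to-world transform; inverting it maps
    // world coordinates into camera coordinates.
    let worldToCamera = frame.camera.transform.inverse
    let p = routeNode.simdWorldPosition
    let cameraPoint = worldToCamera * simd_float4(p.x, p.y, p.z, 1)
    return simd_float3(cameraPoint.x, cameraPoint.y, cameraPoint.z)
}
```

In this camera space a visible point lies in front of the camera along the negative z axis, which is the convention assumed in the projection formulas below.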
Figure 4: Location of the user's phone in the three-dimensional coordinate system (in 3D space) [20]

The relationship between the camera's coordinate system and the screen's coordinate system can be expressed using the formula shown in Figure 5.

Figure 5: Relationship between the camera coordinate system and the screen coordinate system [20]

Thus, the matrix P of the projection of the image onto the screen can be determined by Formula 1:

P = | x   0   0              0          |
    | 0   y   0              0          |
    | 0   0   (f+n)/(n-f)    2fn/(n-f)  |      (1)
    | 0   0   -1             0          |

where x and y regulate the angle of view and the aspect ratio, and f and n are the far and near clipping distances, respectively [17].

Thus, the projection of a point z = (0, 0, z, 1)^T given in the world coordinate system is obtained by calculating p = P * z and then applying the perspective division q = p / p.w. Next, the window depth is calculated by the formula winZ = (q.z + 1) / 2. Following these formulas, we can derive a relationship between winZ and the z coordinate of our point on the z axis. Namely:

winZ = (p.z / p.w + 1) / 2
p.z / p.w = 2 * winZ - 1
p.w = -z
p.z = z * (f + n) / (n - f) + 2fn / (n - f)
(z * (f + n) / (n - f) + 2fn / (n - f)) / (-z) = 2 * winZ - 1
z = fn / (f * winZ - n * winZ - f), or, equivalently, winZ = f * (n + z) / ((f - n) * z)      (2)

It is notable that when z = -f, then winZ = 1, and when z = -n, then winZ = 0. Meanwhile, we have the inverse relationship (Figure 6).

Figure 6: Inverse relationship between the camera coordinate system and the screen coordinate system [20]

In the course of the research, an augmented reality-based information system has been developed in the form of a mobile application for iOS using the Swift programming language, the ARKit library and the SceneKit framework. This system allows the user to record a route from point A to point B in real time, save it, and display it using augmented reality (the direction of movement from point A to point B is shown by arrows) at the user's request. All user-recorded routes are stored in the mobile application database and are available for sharing with users who have the app installed on their phones.

4. Experiments

The experiments have been conducted on the campus of Khmelnytskyi National University (KhNU). The following technologies have been used to accomplish the task of the research: the iOS operating system, the Swift programming language and the ARKit augmented reality library. Two of the most popular routes were plotted on the campus plan (Figure 7).

Figure 7: KhNU campus plan with marked routes for visualization using the augmented reality information technology

The route marked yellow is the route from academic building 3, where the administration of KhNU is located and classes are held, to Hostel 3, where the students of the IT faculty live. The route marked red is the route from Hostel 4, where foreign students live, through the botanical garden, passing the workshops base, where some practical classes in technical disciplines are held, towards academic building 3.
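The recording, saving and replaying steps described in the previous section can be illustrated with a short sketch: recorded route points are archived into a secure-coding container and later restored as simple marker nodes. The function names, the cone-shaped markers and the storage handling here are assumptions for illustration and are not taken from the ARroute source code.

```swift
import SceneKit

// Illustrative sketch of saving and replaying a recorded route.
// Route points are stored as NSValue-wrapped SCNVector3 values in a
// secure-coding archive, then restored as small marker nodes.
func saveRoute(points: [SCNVector3], to url: URL) throws {
    let values = points.map { NSValue(scnVector3: $0) }
    let data = try NSKeyedArchiver.archivedData(withRootObject: values,
                                                requiringSecureCoding: true)
    try data.write(to: url)
}

func replayRoute(from url: URL, in scene: SCNScene) throws {
    let data = try Data(contentsOf: url)
    guard let values = try NSKeyedUnarchiver.unarchivedObject(
        ofClasses: [NSArray.self, NSValue.self], from: data) as? [NSValue] else { return }
    for value in values {
        // A small cone stands in for the direction arrows shown in the app.
        let marker = SCNNode(geometry: SCNCone(topRadius: 0, bottomRadius: 0.03, height: 0.1))
        marker.position = value.scnVector3Value
        scene.rootNode.addChildNode(marker)
    }
}
```

In the application itself, the saved routes are additionally stored in the database and shared between users, as described above.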
Figure 8 shows the interface windows of the developed AR-based information system for route planning and visualization: a) the main screen (after authorization); b) the main menu with the saved routes; c) the process of routing using augmented reality.

Figure 8: Interface windows of the AR-based mobile application for routing and visualization of the routes

5. Results of performance testing and discussion

The quality and efficiency of the ARroute application were comparatively evaluated using the main performance parameters: memory usage, application launch performance, memory leaks, maximum CPU usage, energy impact and the number of frames per second (FPS). The architecture and software development model proposed in this paper work quite well compared to other similar applications. The comparative analysis has been conducted between ARroute and the Feed Me application [21], which uses the Google Maps iOS SDK and can therefore effectively be treated as Google Maps itself. The tests have been performed on an iPhone 11 with iOS 14.7.1. The ARroute application proposed in this paper is also more user-friendly than similar systems (in our case, Feed Me), as it has an intuitively built interface and an understandable principle of use.

Figure 9 shows that the panorama views and 3D routing provided by Google Maps use videos that have already been taken and saved by other users. For the experiment, we tested a 3D route across the Botanical Garden. The time stamp shows that this video was taken in April 2019. Therefore, we can conclude that Google Maps does not provide real-time 3D route paving. In contrast, ARroute provides real-time route paving: it helps the user to intuitively build the route using augmented reality markers, and the saved route can be reproduced using the device camera in real time when needed.

Figure 9: 3D route across the KhNU Botanical Garden in the Google Maps application

Table 3 shows the comparative analysis of the ARroute application and Feed Me (Google Maps) in the main performance characteristics.

Table 3: Results of performance testing of the ARroute and Feed Me (Google Maps) applications using Xcode tools

Characteristic | Feed Me application (Google Maps) | ARroute application
Memory usage | 137 MB per 188 seconds | 185 MB per 189 seconds
Application launch performance | 16.44 seconds | 3.76 seconds
Memory leaks | No leaks | No leaks
Maximum CPU usage | 67% | 85%
Energy consumption | High | Low
Frames per second (FPS) | 49 | 57

Memory usage is mentioned as one of the indicators of efficiency and correctness, so we use this parameter as an efficiency indicator in comparison with other projects. To calculate memory usage, the compared applications were run for approximately 3 minutes and actively used; the given experiment data are the average values for each category. Application launch performance indicates how much time it takes for the application to start. The results of the experiment showed that ARroute needs less time to launch. Memory leaks happen when objects that are no longer needed cannot be released from memory for various reasons; they can lead to memory loss, poor performance and automatic application crashes. According to the experiment, neither application has problems with memory leaks. On the other hand, ARroute has a higher level of CPU usage at peak times. Energy impact shows how much battery power the device consumes to perform tasks. This rate is very high in Feed Me and low in ARroute.
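The figures in Table 3 were collected with Xcode's profiling tools; launch time in particular can also be measured programmatically with XCTest metrics. The sketch below is a minimal illustration that assumes a UI test target, which is not described in the paper.

```swift
import XCTest

// Illustrative sketch of measuring application launch performance with
// Xcode's built-in metrics (comparable to the "Application launch
// performance" row of Table 3). The test target itself is an assumption.
final class LaunchPerformanceTests: XCTestCase {
    func testLaunchPerformance() {
        // Launches the app several times and reports the average launch duration.
        measure(metrics: [XCTApplicationLaunchMetric()]) {
            XCUIApplication().launch()
        }
    }
}
```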
Frame rate (FPS) is the frequency (speed) at which the image processing device displays consecutive images called frames. This indicator shows how quickly, smoothly and clearly the graphics of the application are displayed on the smartphone screen. Despite using augmented reality technology, ARroute has a higher FPS indicator than Feed Me (Google Maps), which only uses the user's location.

6. Conclusions

Thus, the proposed augmented reality-based information system for routing and route visualization provides quick and accurate route paving and saves the route for subsequent display at the user's request. During the study, a survey among students was conducted, which showed the relevance and necessity of the research and the development of the proposed mobile application. The literature analysis and the analysis of already existing AR-based mobile applications also provided the conclusions about the technologies to be used for the development of the proposed augmented reality-based information system for routing and route visualization. The performance testing performed during the experiment showed that the proposed ARroute application has high application launch performance, no memory leaks, low energy consumption and a high image processing frequency. The developed application works quite well and has a user-friendly and intuitive interface. The further efforts of the authors will be directed at improving the existing algorithms for working with augmented reality technology, conducting more experiments and improving the proposed mobile application.

7. References

[1] M. Larsson, Geographical Visualization Within Augmented Reality, Master's thesis, Royal Institute of Technology (KTH), Stockholm, Sweden, 2018.
[2] G. Y. Kostov, Fostering Player Collaboration Within a Multimodal Co-Located Game, 2015, p. 66.
[3] G. Kipper, What Is Augmented Reality?, 2013.
[4] B. Furht, Handbook of Augmented Reality, Springer Science & Business Media, 2013.
[5] S. Bernelind, Navigation in Augmented Reality, 2015.
[6] S. Irshad, D. R. A. Rambli, User experience of mobile augmented reality: A review of studies, in: 3rd International Conference on User Science and Engineering (i-USEr), 2014, pp. 125-130.
[7] S. Irshad, D. R. A. Rambli, Preliminary user experience framework for designing mobile augmented reality technologies, in: 4th International Conference on Interactive Digital Media, 2015, pp. 1-4.
[8] E. Lando, How Augmented Reality Affects the Learning Experience in a Museum, 2017.
[9] G. Bhorkar, A Survey of Augmented Reality Navigation, Aalto University, 2017. https://www.researchgate.net/publication/319164069.
[10] Gatwick Airport navigation, 2017. http://airport-suppliers.com/gatwick-installs-2000-indoor-navigation-beacons-enabling-augmented-reality-wayfinding-world-first-airport.
[11] AR GPS Walk/Drive navigation mobile app, 2021. https://apkfab.com/ar-gps-drive-walk-navigation/com.w.argps.
[12] Locus Maps - an application for navigation, 2021. https://www.locusmap.app/.
[13] M. M. Youssef, S. A. Mousa, M. O. Baloola, B. M. Fouda, The Impact of Mobile Augmented Reality Design Implementation on User Engagement, in: M. Singh, P. Gupta, V. Tyagi, J. Flusser, T. Ören, G. Valentino (Eds.), Advances in Computing and Data Sciences, ICACDS 2020, Communications in Computer and Information Science, vol. 1244, Springer, Singapore.
https://doi.org/10.1007/978-981-15-6634-9_10.
[14] iOS Developer Documentation, 2022. https://developer.apple.com/documentation/scenekit/.
[15] K. C. Brata, D. Liang, S. H. Pramono, Location-Based Augmented Reality Information for Bus Route Planning System, International Journal of Electrical and Computer Engineering 5 (2015) 142-149.
[16] 3D Math for ARKit, 2021. https://titanwolf.org/Network/Articles/Article?AID=3d944b62-a371-4461-9eaa-c19589a5c1c0#gsc.tab=0.
[17] Understanding OpenGL screen z (depth) values, 2021. http://www.alecjacobson.com/weblog/?p=3835.
[18] Google Maps iOS SDK Tutorial, 2020. https://www.raywenderlich.com/7363101-google-maps-ios-sdk-tutorial-getting-started.
[19] T. Hovorushchenko, O. Pavlova, D. Medzatyi, Ontology-Based Intelligent Agent for Determination of Sufficiency of Metric Information in the Software Requirements, Advances in Intelligent Systems and Computing 1020 (2020) 447-460. doi: 10.1007/978-3-030-26474-1_32.
[20] T. Hovorushchenko, O. Pavlova, M. Bodnar, Development of an intelligent agent for analysis of nonfunctional characteristics in specifications of software requirements, Eastern-European Journal of Enterprise Technologies 1(2) (2019) 6-17. doi: 10.15587/1729-4061.2019.154074.
[21] T. Hovorushchenko, O. Pavlova, V. Avsiyevych, Method of Assessing the Impact of External Factors on Geopositioning System Operation Using Android GPS API, in: Proceedings of the 2021 IEEE International Scientific and Technical Conference "Computer Science and Information Technologies" (CSIT-2021), Lviv-Zbarazh, Ukraine, September 22-25, 2021, pp. 295-298. doi: 10.1109/CSIT52700.2021.9648811.