Review of UAV positioning in indoor environments and new proposal based on US measurements

M. C. Pérez, D. Gualda, J. de Vicente, J. M. Villadangos, J. Ureña
Department of Electronics, University of Alcalá, Spain
mcarmen.perezr@uah.es

Abstract. The use of unmanned aerial vehicles (UAVs) has increased dramatically in recent years because of their huge potential in both civil and military applications and the decreasing price of UAV products. Location detection can be implemented through GNSS technology in outdoor environments; nevertheless, its accuracy may be insufficient for some applications. The usability of GNSS in indoor environments is limited due to the signal attenuation when crossing walls and the absence of line of sight. Given the large market opportunity of indoor UAVs, many researchers are devoting their efforts to exploring solutions for their positioning. Indoor UAV applications include location-based services (LBS), advertisement, ambient assisted living environments and emergency response. This work is an updated survey of UAV indoor localization, intended to provide a guide and a technical comparison of the different technologies with their main advantages and drawbacks. Finally, we propose an approach based on an ultrasonic local positioning system.

Keywords: Unmanned Air Vehicle (UAV), indoor positioning, performance evaluation, ultrasonic LPS.

1 Introduction

The use of UAVs, commonly known as drones, in indoor environments makes it possible to speed up data collection and to access complicated areas. Technological developments, such as long-lasting batteries, miniaturization, the incorporation of a variety of sensors and cost reductions, have made UAVs more affordable and attractive for the civilian market. Thus, new areas of application research are emerging, such as advertising, social networks, gaming, security, emergency response and other location-based services. In all cases, it is necessary to know the UAV position with more or less accuracy depending on the application. GNSS cannot be reliably used in indoor environments due to its poor indoor coverage and insufficient accuracy. Moreover, it seems that no single technology can solve indoor location by itself while coping with different grades of accuracy (even centimetre positioning in certain places and more relaxed requirements in others) [1]. Thus, the need for specialized methods, technologies and sensor fusion is widely accepted, not only in UAV location but also in general indoor positioning systems (IPS) [2].

There are not many surveys on UAV indoor navigation, mainly because of the novelty of the topic. [1] is an example, although it is very focused on works that fuse IMU measurements with camera (vision) information through SLAM or visual odometry methods. Hence, it omits several important technologies also applied to UAVs, such as Signals of Opportunity (SOP) or UWB. Moreover, that study does not include some useful information, such as the approximate accuracy, coverage area, cost, and strengths or weaknesses of the evaluated proposals. Many more surveys and review studies can be found related to IPS for user or robot localization [3-6].
Some useful information can be found in those surveys, although the UAV localization problem is more complex and has to deal with additional inconveniences, such as the velocity of the drone, the stabilization problem, the limited computation power of the onboard processor, battery duration, weight restrictions if additional hardware has to be added, and the air turbulence and acoustic noise caused by the motors in the case of acoustic technology. A good guide for beginners in IPS is [5], which offers an extensive review of the main technologies according to their accuracy and applications. The work in [3] is more manageable and provides a table with an interesting IPS classification, together with a collection of trends and useful considerations.

This Work in Progress paper extends the review in [1] by considering the most promising technological approaches for UAV indoor localization, providing a comparison table with useful information regarding accuracy, cost, coverage and other important considerations. In addition, we evaluate the performance of Ultrasonic Local Positioning Systems (U-LPS) for UAV position estimation. This technology is not as commonly used in UAVs as cameras, but it is cheap and has the potential to reach centimetre-level positioning if impairment factors such as the Doppler effect or propeller noise are compensated. U-LPS can be a good option to be used together with other technologies [7].

The rest of the document is structured as follows: Section 2 compares the main literature devoted to indoor UAVs and reviews the most used positioning technologies in this field. Section 3 presents a new proposal using ultrasonic signals coming from static beacons in the environment. Finally, the main conclusions are drawn in Section 4.

2 Technologies comparison and literature review

To identify the trends in UAV indoor localization we have used the advanced search tool offered by IEEE Xplore [8]. The study was performed through an advanced search of the metadata (title, abstract and indexing terms) of the papers, on the assumption that most authors highlight the key technologies used in their work. Note that many works mention a combination of several technologies. Only 33 results are obtained when the search is focused on indoor UAV positioning (Fig. 1a), whereas approximately 400 works appear when the search considers 3D indoor positioning in general, including not only UAVs but also personal and robot navigation applications (Fig. 1b). In either case, the predominance of optical cameras and inertial measurements can be observed. In the following we review the main characteristics of the technologies most employed in UAV localization and discuss some related work.

(a) Main technologies for 3D UAV indoor positioning (b) Main technologies for 3D indoor positioning
Fig. 1. Overview of technology trends in the field of indoor positioning in the IEEE Xplore Digital Library.

2.1 Computer vision

54% of the indoor UAV works use an optical camera, usually accompanied by another technology. Many UAVs come with cameras, so the motion of images can be used for positioning and mapping purposes. Some of the works use visual odometry (VO), while others prefer Simultaneous Localization and Mapping (SLAM) ([1] collects some representative examples). [9] defines VO as the process of egomotion estimation of an agent (a UAV in this case) by using only the input of a single or multiple cameras attached to it, and SLAM as the process in which the agent localizes itself in an unknown environment while the camera is moving, building a map at the same time without any prior information.
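To make the frame-to-frame VO step concrete, the following minimal sketch (Python with OpenCV) estimates the relative camera rotation R and unit-scale translation t between two consecutive frames. It is an illustrative sketch, not the method of any cited work; the frame pair and the camera intrinsic matrix K are assumed inputs.

```python
# Minimal sketch of one frame-to-frame visual odometry step (illustrative,
# not taken from any cited work). Inputs prev_frame, curr_frame (grayscale
# images) and the 3x3 camera intrinsic matrix K are assumed to be given.
import numpy as np
import cv2

def vo_step(prev_frame, curr_frame, K):
    """Estimate the relative rotation R and unit-scale translation t
    between two consecutive camera frames."""
    orb = cv2.ORB_create(2000)                       # feature detector/descriptor
    kp1, des1 = orb.detectAndCompute(prev_frame, None)
    kp2, des2 = orb.detectAndCompute(curr_frame, None)

    # Match binary ORB descriptors with Hamming distance.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC outlier rejection, then pose recovery.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t   # with a monocular camera, t is recovered only up to scale
```

Note that with a single camera the translation is recovered only up to an unknown scale factor, which is one of the reasons the works cited below fuse VO with stereo rigs or inertial measurements.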
An example of a VO work is presented in [10], which estimates the 6-degrees-of-freedom pose of an AscTec Pelican quadrotor with the aid of an ARM-based stereo vision pre-processing system. Another approach using a stereo vision sensor, but with off-board cameras, is presented in [11] for indoor UAVs. [12] is an example coming from a European project to create a set of vision-controlled micro UAVs capable of autonomous navigation and 3D mapping in GPS-denied environments, using only a single onboard camera with monocular SLAM fused with inertial measurements; the computation is performed onboard in real time. Offline, a dense-mapping process merges the individual maps from each UAV into a single global map. In general, this technology offers accuracies that can vary from centimetres, or even less, to metres, with coverage areas that can span large rooms and typical update rates higher than 10 Hz [5]. Nonetheless, the required computation times are still a bottleneck for real-time use [7].

2.2 Inertial technology

29% of the total papers found use Inertial Navigation Systems (INS) as the primary technology. In these cases, the position, velocity and orientation estimates come from an Inertial Measurement Unit (IMU). The IMU consists of three accelerometers, three gyroscopes and/or a magnetometer. Starting from an initial position, and knowing the orientation and speed, future locations can be estimated without the need for external infrastructure. The main problem of this method is that it suffers from an accumulation of position and angular deviations over the travelled distance (a small error in direction can produce a huge positioning error if a long distance is travelled), as illustrated by the sketch below. Usually, inertial methods are combined with other sensors in order to achieve good performance and correct the drift, as in [12] with cameras or in [13] with UWB and an optical-flow sensor.
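The following toy strapdown example illustrates the drift: a hovering drone whose accelerometer has a small constant bias. All numerical values are illustrative assumptions, not taken from any cited system.

```python
# Illustrative dead-reckoning sketch: double-integrating noisy IMU readings
# makes the position error grow without bound (the drift described above).
# All numbers are made-up illustration values, not from any cited system.
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                                   # 100 Hz IMU sampling
n = 6000                                    # 60 s of flight
true_acc = np.zeros(n)                      # hovering: true acceleration = 0
acc_meas = true_acc + 0.02 + 0.05 * rng.standard_normal(n)  # bias + noise (m/s^2)

vel = np.cumsum(acc_meas) * dt              # first integration  -> velocity
pos = np.cumsum(vel) * dt                   # second integration -> position

print(f"position error after 60 s: {pos[-1]:.2f} m")
# A constant 0.02 m/s^2 bias alone already yields 0.5*b*t^2 = 36 m after 60 s,
# which is why INS is fused with cameras [12] or UWB [13] to bound the drift.
```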
2.3 Infrared (IR)

Nowadays, IR, RFID, WLAN, Bluetooth and ultrasound (US) are less used for indoor UAV positioning than the already mentioned cameras, IMUs or UWB-based systems. IR uses electromagnetic radiation with wavelengths longer than that of visible light, so it is invisible to the human eye under most conditions. Typical accuracies can vary from centimetres (artificial IR light sources) to metres (with active beacons or when natural radiation is used) [5], with drawbacks such as line-of-sight (LOS) requirements, reflectivity and scattering; IR systems can also be adversely affected by sunlight. [14] is one of the first references on indoor UAV localisation using an IR LPS. It consists of one transmitter fitted to the UAV and three infrared receivers at known positions in the environment. Its limitations were the short range of the system (approximately 2 m) and the need for line of sight. Recently, in [15] a system based on IR beacons, computer vision and IMU sensor-data fusion was proposed for autonomous UAV landing when there is no GNSS availability. Autonomous landing implies precise localization of the UAV under real-time constraints. Challenges include the limited power of the beacons, background illumination and the need to distinguish between similar-looking beacon images. Accuracies were obtained through simulation tests and vary from tens of centimetres to metres depending on the distance to the landing point.

2.4 Radio Frequency (RF)

RF-based systems are widely used in positioning as they can take advantage of the installed communications infrastructure. RF includes Wireless Local Area Network (WLAN, also referred to as WiFi), RFID, UWB, LTE, Bluetooth and ZigBee, among others. The main methods for location in RF-based systems include signal-strength fingerprinting (which requires an extensive prior effort in mapping and collecting WiFi patterns) and time-of-arrival measurements (which must face tough indoor signal propagation conditions).

2.4.1 Ultrawideband (UWB)

This technology is one of the most used in 3D indoor positioning, usually helped by another technology. It is a radio technology for short-range, high-bandwidth communication that is able to provide centimetre accuracy. UWB achieves strong multipath resistance, since the wide bandwidth makes it easier to detect the time-delayed versions of the emitted signal; it also achieves good material penetrability. Nevertheless, it can be adversely affected under strong scattering conditions and requires dedicated infrastructure [5, 7]. UWB uses either Time of Arrival (ToA) or Time Difference of Arrival (TDoA). J. Tiemann et al. [13] presented a system based on TDoA UWB that uses eight UWB receiver nodes distributed over the laboratory and a tag located on top of the UAV. It also integrates some sensors available on the drone: the optical-flow and IMU sensors for velocity and yaw feedback. The third quartile of the alignment errors was lower than 10 cm. [16] merges, through a Monte Carlo Localization (MCL) algorithm, information from UWB sensors with visual odometry from an RGB-D camera, considering a previously built multi-modal map with the UWB sensor locations. The VO provides reliable short-term pose estimation, whose drift is corrected thanks to the UWB measurements; in turn, UWB outliers can be filtered thanks to the VO. Experimental results in a 15 x 15 x 5 m scenario, equipped with a motion-tracking system to provide millimetre ground truth, show errors below 22 cm.

2.4.2 Radio Frequency Identification (RFID)

RFID systems consist of readers with an antenna that interrogate active transceivers or passive tags in order to obtain each tag's unique identification. RFID tags are known as passive if they do not require batteries but operate by means of inductive coupling; on the contrary, active tags incorporate their own batteries. Reported accuracies are around 1-5 m, which is insufficient for many indoor applications. [17] explores the RFID-UAV field.

2.4.3 Wireless Local Area Network (WLAN)

WLAN is commonly available in many indoor environments, so it can be used on a Signals-of-Opportunity (SOP) basis to estimate the location at a reduced cost. The Received Signal Strength Indicator (RSSI) is quite cheap and minimally invasive, but it is not an accurate method, so it is usually combined with other technologies, such as ultrasound signals [18]. [19] proposes Multi-Dimensional Scaling (MDS) and Weighted Centroid Localization (WCL) to compute the location based on distance measurements between the UAV and the existing WiFi infrastructure. The authors claim localization errors of less than 5% of the radio range in a non-complex environment.
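As a rough illustration of RSSI-based ranging of the kind used in such systems, the sketch below inverts the standard log-distance path-loss model and combines the per-access-point distances with an inverse-distance weighted centroid in the spirit of WCL. The calibration values P0 and n are illustrative assumptions, not parameters from [19].

```python
# Sketch of RSSI-based ranging with the standard log-distance path-loss model.
# P0 (RSSI at d0 = 1 m) and the path-loss exponent n are environment-dependent
# calibration values; the numbers here are illustrative, not from [19].
import numpy as np

def rssi_to_distance(rssi_dbm, p0_dbm=-40.0, n=2.5, d0=1.0):
    """Invert the model RSSI(d) = P0 - 10*n*log10(d/d0)."""
    return d0 * 10.0 ** ((p0_dbm - rssi_dbm) / (10.0 * n))

def weighted_centroid(ap_positions, distances):
    """WCL-style estimate: average AP positions weighted by 1/d."""
    w = 1.0 / np.asarray(distances)
    return (w[:, None] * np.asarray(ap_positions)).sum(axis=0) / w.sum()

aps = [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]]        # assumed AP positions (m)
d = [rssi_to_distance(r) for r in (-65.0, -55.0, -60.0)]
print(weighted_centroid(aps, d))                     # coarse 2D estimate
```

In practice the exponent n varies strongly between rooms, which is one reason RSSI alone yields only metre-level accuracy.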
2.4.4 Bluetooth

Bluetooth is a wireless standard for Wireless Personal Area Networks created by the Bluetooth Special Interest Group, originally intended for data exchange over short distances. BLE (Bluetooth Low Energy) can be an interesting option due to its low cost and very low power consumption. Nevertheless, according to Mautz [5], Bluetooth does not offer reliable accuracy (time-of-flight, RSSI and fingerprinting works were evaluated, offering room-level accuracy). Even so, some authors are working on Bluetooth and other wireless standards such as ZigBee. For instance, [20] presents a UAV patrol system based on BLE location for areas where GNSS coverage is poor.

2.5 Ultrasound and acoustic signals

There are only a few papers related to 3D UAV ultrasound indoor positioning. [21] presents a system based on five ultrasonic transducers and IMU measurements to implement a 3D wall-following algorithm. [7] uses a Time-of-Flight camera to obtain an initial estimate of the vehicle height and an encoded ultrasonic U-LPS to compute the horizontal vehicle position through a 2D multilateration procedure. In [22], a U-LPS with five emitting beacons is placed on the ceiling and the performance of the ultrasonic receiver at different heights on the drone is evaluated in terms of 3D estimates. The authors report the problem of saturation of the received signals caused by the drone's acoustic noise. Section 3 includes some preliminary results for an ultrasound-based UAV localization system.

2.6 Other technologies

There are other technologies that can also be applied to drone positioning. For instance, Visible Light Communication (VLC) is investigated for use in drones in [23], where challenges such as the placement of ground units, data rate, long range, high-speed mobility and energy management were identified. Laser scanning (LIDAR) has also been evaluated in [24, 25], obtaining high accuracies; nevertheless, it is an expensive technology, requires LOS, and the position estimates can be degraded at high sun angles and by reflections.

Table 1 compares several proposals for indoor UAV positioning. For each technology, some representative references have been included and classified according to: accuracy (the difference between the estimated position and the actual one); coverage (the extent over which the system can be located with the used technology); cost (here we have considered the cost for each end user); location method (how the location estimates are obtained); and, finally, the advantages and drawbacks in each case.
Table 1. Comparison of UAV indoor location proposals.

| Ref. | Year | Technology | Accuracy | Coverage | Cost | Location method | Advantages / drawbacks |
| [7] | 2018 | Camera, US | cm (in static test positions) | Room | Low | TDoA, ToF camera, Gauss-Newton | Good precision / Doppler effect, acoustic noise |
| [10] | 2015 | Stereo vision | 10 m | Room | Low | VO | Size and weight / sensitive to light conditions |
| [13] | 2017 | IMU, optical flow, US, GNSS, UWB | 10 cm | Room | High | UWB TDoA, EKF | Good accuracy / high cost |
| [15] | 2015 | Infrared, computer vision, IMU | 10 cm-4 m | Room | Low | Particle filtering | Easy to deploy / LOS dependence, sunlight interference |
| [16] | 2017 | UWB, RGB-D sensor | 20 cm | Room | High | VO, UWB RO-SLAM, MCL | Good accuracy / high cost |
| [19] | 2017 | WiFi | 1 m | Building | Low | MDS, WCL | Reuses infrastructure / trade-off between accuracy and complexity |
| [20] | 2017 | Bluetooth, GPS | 2 m | Room / outdoors | Low | RSSI | Low consumption and cost / needs signal mapping, low precision |

3 Proposal of UAV positioning based on US measurements

As can be observed in Fig. 1, ultrasound technology is not a common choice for UAV positioning. Nevertheless, recent works have appeared [7][18] that demonstrate its feasibility for refining 3D positioning and its potential when used with other technologies. Here we present some preliminary tests in a laboratory environment using only ultrasonic signals. The ultrasonic signals are generated by an Ultrasonic Local Positioning System (U-LPS) called LOCATE-US [26], which was developed by the GEINTRA research group of the University of Alcalá [27]. The U-LPS is composed of five ultrasonic emitters managed by a Xilinx Zynq 7000 FPGA [28, 29]. Each emitter transmits a 255-bit Kasami sequence [30] modulated with a Binary Phase Shift Keying (BPSK) carrier centred at 41.67 kHz. The samples of the modulated Kasami sequences are stored in the FPGA and read out at a rate of 500 kHz, while the acquisition frequency at the receiver is 100 kHz. An ad-hoc ultrasonic acquisition module [31] is placed on top of a Parrot Bebop 2 drone [32] to capture the ultrasonic signals; the processing is then carried out on a laptop computer. Fig. 2 shows the localization area, with the U-LPS installed on the ceiling (at around 3 m height) and the UAV flying inside the US coverage area.

Fig. 2. UAV position estimation test inside the localization area covered by the LOCATE-US system.

The initial experimental tests were carried out flying the UAV at an approximately stable height of around 1.4 m. We obtained 100 measurements, which were processed to get the position estimates by solving a non-linear equation system composed of four differences of distances between the receiver and the emitters, instead of absolute distances, since there was no synchronization between the transducers and the receiver. The method used to solve the equation system was based on the well-known Gauss-Newton algorithm (a sketch is given below), although the position could also be obtained with any other non-linear solver.
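A minimal sketch of this hyperbolic (difference-of-distances) solver is shown below, taking beacon 0 as the timing reference, so that each measurement is a range difference r_i = ||x - b_i|| - ||x - b_0||. The beacon coordinates, the synthetic measurements and the initial guess are illustrative placeholders, and the actual LOCATE-US implementation may differ.

```python
# Sketch of a Gauss-Newton solver for the hyperbolic positioning problem
# described above. Beacon coordinates and measurements are placeholders.
import numpy as np

beacons = np.array([[0.0, 0.0, 3.0],        # b_0 (reference emitter, on ceiling)
                    [0.7, 0.0, 3.0],
                    [0.0, 0.7, 3.0],
                    [0.7, 0.7, 3.0],
                    [0.35, 0.35, 3.0]])

def residuals(x, beacons, meas):
    d = np.linalg.norm(beacons - x, axis=1)          # distances to all emitters
    return (d[1:] - d[0]) - meas                     # predicted minus measured

def jacobian(x, beacons):
    u = (x - beacons) / np.linalg.norm(beacons - x, axis=1)[:, None]  # unit vectors
    return u[1:] - u[0]                              # d r_i / d x

def gauss_newton(meas, beacons, x0, iters=20, tol=1e-6):
    x = x0.astype(float)
    for _ in range(iters):
        r = residuals(x, beacons, meas)
        J = jacobian(x, beacons)
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)  # GN step: solve J dx = -r
        x += dx
        if np.linalg.norm(dx) < tol:
            break
    return x

true_pos = np.array([0.2, 0.3, 1.4])                 # e.g. the ~1.4 m flight height
meas = (np.linalg.norm(beacons[1:] - true_pos, axis=1)
        - np.linalg.norm(beacons[0] - true_pos))     # noise-free synthetic TDoA ranges
print(gauss_newton(meas, beacons, x0=np.array([0.3, 0.3, 1.0])))
```

Note that with all five emitters nearly coplanar on the ceiling, the geometry conditions the vertical coordinate poorly, which is consistent with the comparatively high height deviation reported below.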
Fig. 3 shows the visual results of the experiment: the average position estimate over the 100 measurements and the 95% confidence ellipse around the mean estimate.

Fig. 3. UAV position estimation results.

The standard deviation of each coordinate can also be observed at the top of the figure. In this case, there was a high standard deviation (0.28 m) in the receiver height estimate, due to geometrical considerations and to the fact that the UAV position was estimated using differences of distances. Nevertheless, the dispersion of the x and y coordinates was quite low (0.09 m and 0.05 m, respectively), so the error in the 2D plane was not very high (up to one decimetre). In future work we will try to determine the ground truth of the UAV position by using another localization technology more accurate than ultrasound, such as cameras or laser measurements, and to perform the low- and high-level processing for location estimation onboard.

4 Conclusions

We have presented a survey of different works regarding UAV localization in indoor environments. Indoor UAV localization is an interesting field due to the variety of sensor technologies that can be applied and the difficulties they have to face: drone velocity, strong multipath, NLOS, computational load, etc. Trends point to the use of hybrid systems built on complementary technologies, such as SOP signals and inertial navigation, ultrasound and IMU, or UWB and cameras. We have also presented some preliminary tests based on ultrasonic signals.

Acknowledgment. This work has been possible thanks to the University of Alcalá (project UAH-AE2017-4), Junta de Comunidades de Castilla-La Mancha (FrailCheck project, SBPLY/17/180501/000392), and the Spanish Ministry of Economy and Competitiveness (MICROCEBUS, TIN2018-095168-B-C51).

References

1. Balamurugan, G., Valarmathi, J., Naidu, V. P. S.: Survey on UAV Navigation in GPS Denied Environments. International Conference on Signal Processing, Communication, Power and Embedded System (SCOPES), pp. 198-204, 2016.
2. Bahillo, A., Aguilera, T., Álvarez, F. J., Perallos, A.: WAY: Seamless Positioning Using a Smart Device. Wireless Personal Communications, vol. 94, pp. 2949-2967, 2017.
3. Brena, R. F., García-Vázquez, J. P., Galván-Tejada, C. E., Muñoz-Rodriguez, D., Vargas-Rosales, C., Fangmeyer, J.: Evolution of Indoor Positioning Technologies: A Survey. Hindawi Journal of Sensors, pp. 1-21, 2017.
4. Liu, H., Darabi, H., Banerjee, P., Liu, J.: Survey of wireless positioning techniques and systems. IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 37, no. 3, pp. 263-282, 2008.
5. Mautz, R.: Indoor Positioning Technologies. Habilitation thesis, ETH Zürich, 2012.
6. Deak, G., Curran, K., Condell, J.: A survey of active and passive indoor localisation systems. Computer Communications, vol. 35, no. 16, pp. 1939-1954, 2012.
7. Paredes, J. A., Álvarez, F. J., Aguilera, T., Villadangos, J. M.: 3D Indoor Positioning of UAVs with Spread Spectrum Ultrasound and Time-of-Flight Cameras. Sensors, vol. 18, no. 89, 2018.
8. IEEE Xplore, https://ieeexplore.ieee.org/Xplore/home.jsp, last accessed 2019/05/21.
9. Yousif, K., Bab-Hadiashar, A., Hoseinnezhad, R.: An Overview to Visual Odometry and Visual SLAM: Applications to Mobile Robots. Intelligent Industrial Systems, vol. 1, no. 4, pp. 289-311, 2015.
10. Fu, C., Carrio, A., Campoy, P.: Efficient visual odometry and mapping for Unmanned Aerial Vehicle using ARM-based stereo vision pre-processing system. International Conference on Unmanned Aircraft Systems (ICUAS), 2015.
11. Mustafah, Y. M., Azman, A. W., Akbar, F.: Indoor UAV Positioning Using Stereo Vision Sensor. International Symposium on Robotics and Intelligent Sensors, vol. 41, pp. 575-579, 2012.
12. Scaramuzza, D. et al.: Vision-Controlled Micro Flying Robots. IEEE Robotics & Automation Magazine, pp. 26-40, 2014.
13. Tiemann, J., Wietfeld, C.: Scalable and Precise Multi-UAV Indoor Navigation using TDOA-based UWB Localization.
In: International Conference on Indoor Positioning and Indoor Navigation (IPIN), 2017.
14. Kirchner, N., Furukawa, T.: Infrared Localisation for Indoor UAVs. In: International Conference on Sensing Technology, 2005.
15. Khithov, V., Petrov, A., Tischenko, I., Yakovlev, K.: Towards Autonomous UAV Landing Based on Infrared Beacons and Particle Filtering. 4th International Conference on Robot Intelligence Technology and Applications, pp. 529-537, 2015.
16. Perez-Grau, F. J., Caballero, F., Merino, L., Viguria, A.: Multi-modal Mapping and Localization of Unmanned Aerial Robots based on Ultra-Wideband and RGB-D sensing. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017.
17. Casati, G., Longhi, M., Latini, D., Carbone, F., Amendola, S., Del Frate, F., Schiavon, G., Marrocco, G.: The Interrogation Footprint of RFID-UAV: Electromagnetic Modelling and Experimentations. IEEE Journal of Radio Frequency Identification, vol. 1, no. 2, 2017.
18. Cheng, L., Wu, C., Zhang, Y., Wang, Y.: An Indoor Localization Strategy for a Mini-UAV in the Presence of Obstacles. International Journal of Advanced Robotic Systems, vol. 9, no. 4, 2012.
19. Stojkoska, B. R., Palikrushev, J., Trivodaliev, K., Kalajdziski, S.: Indoor Localization of Unmanned Aerial Vehicles Based on RSSI. 17th International Conference on Smart Technologies, IEEE EUROCON, 2017.
20. Zhou, M., Lin, J., Liang, S., Du, W., Cheng, L.: A UAV Patrol System Based on Bluetooth Localization. 2nd Asia-Pacific Conference on Intelligent Robot Systems, 2017.
21. Shang, C., Cheng, L., Yu, Q., Wang, X., Peng, R.: Micro Aerial Vehicle Autonomous Flight Control in Tunnel Environment. In: International Conference on Modelling, Identification and Control, 2017.
22. Gualda, D., Ureña, J., Pérez, M. C., Posso, H., Bachiller, S., Nieto, R.: 3D Position Estimation of an UAV in Indoor Environments using an Ultrasonic Local Positioning System. 9th International Conference on Indoor Positioning and Indoor Navigation, 2018.
23. Ashok, A.: Position: DroneVLC: Visible Light Communication for Aerial Vehicular Networking. Proceedings of the 4th ACM Workshop on Visible Light Communication Systems, 2017.
24. Guerra, E., Munguía, R., Grau, A.: UAV Visual and Laser Sensor Fusion for Detection and Positioning in Industrial Applications. Sensors, vol. 18, no. 2071, pp. 1-20, 2018.
25. Opromolla, R., Fasano, G., Rufino, G., Grassi, M., Savvaris, A.: LIDAR-Inertial Integration for UAV Localization and Mapping in Complex Environments. International Conference on Unmanned Aircraft Systems (ICUAS), 2016.
26. Hernández, A., García, E., Gualda, D., Villadangos, J. M., Nombela, F., Ureña, J.: FPGA-Based Architecture for Managing Ultrasonic Beacons in a Local Positioning System. IEEE Transactions on Instrumentation and Measurement, vol. 66, no. 8, pp. 1954-1964, 2017.
27. GEINTRA group (University of Alcalá), http://www.geintra-uah.org/, last accessed 2019/05/24.
28. Pro-Wave Electronics Corporation: Air Ultrasonic Ceramic Transducers 328ST/R160. Product Specification, 2014.
29. Xilinx, Inc.: Zynq-7000 All Programmable SoC Technical Reference Manual. User Guide, 2014.
30. Kasami, T.: Weight Distribution Formula for Some Class of Cyclic Codes. Report no. R-285, University of Illinois, 1966.
31. Ureña, J., Hernández, A., García, J. J., Villadangos, J. M., Pérez, M. C., Gualda, D., Álvarez, F. J., Aguilera, T.: Acoustic Local Positioning with Encoded Emission Beacons. Proceedings of the IEEE, vol. 106, no. 6, 2018.
32. Parrot Bebop 2, https://www.parrot.com/es/drones/parrot-bebop-2-fpv#pack-bebop-2-fpv, last accessed 2019/05/24.