An Approach to Skid Detection Using Optical Flow
and IMU Sensor
Nikola Totev¹ and Boris Robev¹
¹ Faculty of Mathematics and Informatics, University of Sofia “St. Kliment Ohridski”,
James Bourchier Blvd., 1164 Sofia, Bulgaria


             Abstract
             Estimating the position of a mobile robot is a key part of its
             performance and productivity, and depends on the localization
             methods and sensors used. The goal of this paper is to propose a
             method for accurately estimating the position of a mobile robot
             using an optical flow sensor together with an IMU (Inertial
             Measurement Unit) and wheel encoders. By observing and monitoring
             the movement of the robot with these three sensors, most expected
             and unexpected movements can be accounted for, and pose estimation
             can be performed more accurately. The paper proposes an approach to
             detect skidding of mobile robots and to mitigate or avoid its
             effects through an appropriate localization and position estimation
             model that uses multiple different sensors. Together, the sensors
             track and measure the overall velocity, angular velocity,
             orientation, position, and acceleration of the robot. The
             localization algorithm is based on probabilistic localization
             methods and the Kalman filter algorithm.

             Keywords
             Skid detection, optical flow, sensor fusion, Kalman filter, SLAM

1. Introduction
     Robots can be put into one of two groups: industrial and service. According
to the International Federation of Robotics (IFR), “Service robots are technical
devices that perform tasks useful to the well-being of humans in a semi or fully
autonomous way” [1]. The IFR applies the definitions and standards for service
robots according to ISO 8373:2012 [4]. In some cases, details of the IFR
definitions are not applicable or lead to ambiguous categorization of particular
robots relative to the generally accepted standard. To distinguish industrial
robots from service robots, the IFR accepts the ISO criteria determining a robot’s
application in industrial versus non-industrial automation as sufficient to
classify it as an industrial or a service robot.

Information Systems & Grid Technologies: Fifteenth International Conference ISGT’2022, May 27–28, 2022, Sofia, Bulgaria
EMAIL: nikolart@uni-sofia.bg (N. Totev); robev@fmi.uni-sofia.bg (B. Robev)
ORCID: 0000-0001-9220-0322 (N. Totev); 0000-0002-2705-3941 (B. Robev)
© 2022 Copyright for this paper by its authors.
Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
CEUR Workshop Proceedings (CEUR-WS.org)
     The main difference between service and industrial robots lies in their
application and service domains and in their closeness to the end user [2]. Because
service robots operate much closer to the end user, important design aspects
such as safety, reliability, cost, appearance, and user interface [3] need to be
considered. Under this classification, service robots mainly provide services for
professional and non-professional usage, i.e., categorization by application
criteria. A second categorization of service robots is by the surface on which
they move: ground and hard surfaces, water surfaces, the air, and others [8, 9].
     Another important aspect of service robots is that they work in unstructured
and uncertain environments. This means service robots should be designed to
safely handle uncertainty while operating. One such uncertainty concerns position
estimation/localization, as proper localization of a service robot at any moment
of its operation is essential for the quality of the service it provides. Inaccurate
estimation of the position of a service robot causes loss of tracking and creates
uncertainty in the robot’s activities. Such inaccuracies can be caused by skidding
due to insufficient traction between the wheels and the terrain the robot operates
on, or by unexpected lateral shifts or rotations caused by external forces. If not
detected and corrected, these disturbances will lead to erroneous localization
data, which in turn will negatively affect the mapping data.
     The Intelligent Urban Environment Laboratory (IUE-Lab) concept of the
MIRACle project (https://miraclebg.com) is based on deploying service robot
models for mass usage by registering, analyzing, and storing observation data
from the environments in which the robots operate. The goal is to collect and
process all data received from the intelligent sensors located on the service
robot’s body, so that an appropriate model for autonomous service can be
delivered to consumers. In this context, the integration between the sensors and
the purely mechanical part can be used in one of the functional sub-groups of the
IUE, named the Intelligent Home Environment (IHE).
     Deployment of the IHE is based on a layered structure comprising
infrastructure, data, and service layers. This paper proposes a localization and
position estimation approach for skid detection that uses multiple different
sensors to detect and compensate for the unexpected skidding movements
described above. The proposed approach includes a system architecture for a
mobile service robot with an optical flow sensor, an IMU sensor, and encoders; a
probabilistic method for localization and positioning; and a localization and data
flow model for future deployment in the IHE.




2. Localization methods
     This section focuses on localization methods and on how different sensors
can be combined into a robust approach. The sensors discussed in this paper are
optical flow sensors, IMU modules, and encoders. A model, together with an
analysis of how the strongest features of each sensor can be used to create a
robust localization model, is covered in later sections.
     Accurate position estimation is a key component of the successful operation
of most autonomous mobile robots. In general, three phases comprise the motion
of a mobile robot: localization, path planning, and path execution [4].
     Two main methods are used in mobile robot localization, based on two
different perspectives: probabilistic methods and autonomous map building
methods. The first method uses the Kalman filter (KF) algorithm. It requires the
presence of an environment map; in most cases the map is predefined, including
all surrounding objects, obstacles, walls, and trajectories of the robot’s movement.
In a non-dynamic environment, localization can be simple, as the robot is the
only object moving on the surface. The information acquired from the mobile
robot’s sensors is combined to estimate its actual position and localization; an
optimal estimate avoids the pose uncertainty of the mobile robot. In this regard,
the algorithm allows the uncertainties of the individual measurements to be
combined.
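     As an illustration of the KF predict/update cycle, the following is a minimal
sketch for a one-dimensional position estimate. The state layout, noise values, and
function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal linear Kalman filter sketch for a 1-D state [position, velocity].
# All matrices and noise values below are assumed for illustration.
F = np.array([[1.0, 0.1],   # state transition: position += velocity * dt (dt = 0.1 s)
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])  # we measure position only
Q = np.diag([1e-4, 1e-3])   # process noise covariance (assumed)
R = np.array([[1e-2]])      # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle for a scalar position measurement z."""
    # Predict: propagate the state and grow the uncertainty.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: weigh the measurement against the prediction.
    y = np.array([[z]]) - H @ x_pred         # innovation (residual)
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

x, P = np.zeros((2, 1)), np.eye(2)           # initial estimate and covariance
for z in [0.11, 0.19, 0.32]:                 # example position readings
    x, P = kf_step(x, P, z)
```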
     The second method, simultaneous localization and mapping (SLAM), uses the
Extended Kalman Filter (EKF) algorithm for automatic map construction during
mobile robot localization. In a dynamic environment, localization of the robot is
more difficult, because the robot’s movement depends on the movement of the
other moving objects surrounding it. In such cases, the configuration of the
environment changes due to the presence of dynamic objects or due to explicit
modification. This problem is solved by automatic map building: the robot starts
navigating from a random initial point and explores the environment
autonomously through its sensors. It then gains environmental knowledge and
builds a map by interpreting the scene, which helps the robot localize itself
relative to the map [5]. SLAM also uses a probabilistic method, in which the
robot’s position and the map are estimated jointly.
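     The reason SLAM needs the extended Kalman filter is that a wheeled robot’s
motion model is nonlinear in the heading angle, so the filter linearizes it around
the current estimate. Below is a minimal sketch of the EKF predict step for a
unicycle-model pose (x, y, θ); the motion model and Jacobian are textbook
assumptions, not the paper’s implementation.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """EKF predict step for a pose x = [px, py, theta] under a unicycle model.

    v and w are linear and angular velocities (e.g. from wheel encoders);
    Q is the process noise covariance. Textbook sketch, not the paper's code.
    """
    px, py, th = x
    # Nonlinear motion model f(x, u): move forward along the heading.
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of f with respect to the state, evaluated at the estimate;
    # this linearization replaces the constant F matrix of the linear KF.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q
```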

3. System architecture
     Following the definition and layered structure of the IHE, the infrastructure
layer is based on integrated sensors, local data concentrators, a server and/or data
collecting and processing equipment, and special actuators, depending on the
purpose of the service robots.
     The proposed approach uses three different sensors to perform position
estimation. Along with these sensors, a Raspberry Pi is used to process the camera
data by performing an optical flow algorithm and to store the sensor data from the
encoders and the IMU. A block schema of the platform architecture is shown in
Figure 1.
     The encoders are based on an IR reflectance sensor and are mounted near the
wheels. Slotted discs interrupt the IR beam, which generates a square wave that
encodes rotation. A more advanced version uses two IR sensors offset from each
other (a quadrature encoder) to also determine the direction of rotation.
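     As a hedged illustration of the two-sensor variant, the sketch below decodes a
quadrature signal into a signed tick count; the sampled bit pairs are mocked here,
since on the real platform they would be GPIO reads on the Arduino.

```python
# Quadrature decoding sketch: two IR channels (A, B) 90 degrees out of phase.
# A lookup table maps (previous state, current state) to a -1/+1 tick;
# unknown transitions (no change, or a glitch) contribute 0.
TRANSITION = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(samples):
    """Accumulate signed ticks from a sequence of (A, B) bit pairs."""
    ticks = 0
    prev = (samples[0][0] << 1) | samples[0][1]
    for a, b in samples[1:]:
        curr = (a << 1) | b
        ticks += TRANSITION.get((prev, curr), 0)
        prev = curr
    return ticks

# One full forward quadrature cycle yields +4 ticks.
print(decode([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))
```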
     The LSM6DS33 IMU module contains an accelerometer and a gyroscope and
tracks rotation and acceleration along the X, Y, and Z axes.
     The Arduino UNO reads data from the encoders and the IMU in real time and
sends them to the Raspberry Pi for processing.
     The data from the sensors can be exported in .csv format and can contain
rotation, speed, and position data at discrete timesteps.
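     A minimal sketch of loading such a log on the Raspberry Pi side is given
below. The column names are hypothetical, since the paper does not fix a log
schema beyond rotation, speed, and position at discrete timesteps.

```python
import csv

def load_log(path):
    """Read an exported sensor log into a list of per-timestep dicts.

    Assumed (hypothetical) columns: t, gx, gy, gz, ax, ay, az,
    left_ticks, right_ticks -- one row per discrete timestep.
    """
    with open(path, newline="") as f:
        return [{k: float(v) for k, v in row.items()}
                for row in csv.DictReader(f)]

# samples = load_log("sensor_log.csv")
```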
     This is a standard architecture for most robots: real-time sensors such as
IMU modules and encoders are handled by a microcontroller rather than by a
single-board computer like the Raspberry Pi. The main advantages are modularity
and ease of development, meaning that the modules can be developed separately
and then integrated.




Figure 1: Platform architecture


4. Localization and positioning model, dataflow charts, and
results
     The localization and positioning model in this paper is based on a three-
layered structure comprising an input data layer, a data processing layer, and a
results representation layer, including calibration of the sensors where appropriate.
     As mentioned in the previous section, we assume several different sensors,
i.e., an IMU, encoders, and an optical IR/laser sensor. They provide input data on
the current position and location of the service robot. Data acquisition obtains all
data from these sensors, which are mounted at appropriate places on the service
robot’s body. An Arduino UNO gets the data from the IMU and the encoders. The
collected data is sent for processing to the Raspberry Pi module connected to the
Arduino UNO. The data can be in raw format or in CSV format.
     The input data can be processed with Matlab functions, including graphical
representation of the results. The Kalman filter algorithm is used to calculate the
service robot’s pose in two dimensions, i.e., position and orientation estimation.
The optical velocity estimation can, as mentioned, use more than one optical
sensor in order to improve accuracy. For such an improvement it is also beneficial
to include time in the estimation, in order to achieve a proper estimate of the
position over the course of the service robot’s motion.
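     As an illustration of the optical velocity estimation step, the sketch below
uses OpenCV’s dense Farnebäck optical flow to turn consecutive camera frames
into a mean ground velocity. The pixels-per-metre scale factor is a hypothetical
calibration constant that would come from the camera’s height and focal length.

```python
import cv2
import numpy as np

PIXELS_PER_METRE = 1800.0  # hypothetical calibration constant

def flow_velocity(prev_frame, next_frame, dt):
    """Estimate mean ground velocity (vx, vy) in m/s from two BGR frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # Dense per-pixel flow; the numeric arguments are the standard
    # pyramid/window parameters from the OpenCV documentation.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mean_px = flow.reshape(-1, 2).mean(axis=0)   # mean (dx, dy) in pixels
    return mean_px / (PIXELS_PER_METRE * dt)     # convert to m/s
```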
     The model described is graphically represented in Figure 2.




Figure 2: Localization model




Figure 3: Localization data flow

     Based on the model above and the KF algorithm, we present a data flow
model for the localization of robots in four steps (a sketch of the resulting skid
check follows the list):
     • In the first step, data is obtained from the wheel encoders and from
     predefined maps. In this step, data from the Arduino UNO, the camera, and
     the optical flow computation is also gathered and prepared for measurement;
     • The second step produces a localization prediction for the robot, based
     on a measurement prediction that comes from the predefined map and the
     wheel encoders;
     • In the third step, the measured information is matched against the
     observed information coming from the camera, the optical flow sensor, or
     any other sensor attached to the robot. The robot processes both data flows,
     and the outcome is the best match between the two sets of features;
     • In the estimation step, the KF method fuses the matching information in
     order to update the robot’s position.
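     The matching step is also where skidding becomes visible: when the wheels
slip, the encoder-predicted motion and the optical-flow-observed motion disagree.
Below is a minimal sketch of such a consistency check; the threshold value and
function name are illustrative assumptions rather than the paper’s implementation.

```python
import numpy as np

SKID_THRESHOLD = 0.05  # residual velocity (m/s) above which we flag a skid; assumed tuning value

def detect_skid(encoder_v, flow_v):
    """Return True when encoder and optical flow velocities disagree.

    encoder_v, flow_v: 2-D ground velocity vectors (vx, vy) in m/s.
    A large residual means the wheels are turning without matching
    ground motion (or the robot is being pushed), i.e. a skid.
    """
    residual = np.linalg.norm(np.asarray(encoder_v) - np.asarray(flow_v))
    return residual > SKID_THRESHOLD

# Wheels report 0.30 m/s forward but optical flow sees only 0.12 m/s,
# so the robot is most likely slipping in place.
print(detect_skid((0.30, 0.0), (0.12, 0.0)))   # True
```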
     As mentioned above, the robot’s localization estimates the robot’s position
relative to the environment map. The overall localization model and data flow
create a more exact environment map from the information about the robot’s
accurate path. In this way, the proposed model utilizes the data collected from the
different types of sensors, compares it with the predefined map data, and predicts
an accurate robot path. The aggregated data is based on the current position of
the robot, estimated from the surrounding environment, the predefined maps, the
boundaries, and the sensors.

     In dynamic environments, if the SLAM method with the EKF algorithm is
used, the robot can update existing maps or create new ones while moving and
closely tracking and observing all objects around it.
     The expected results of this method should be visualized graphically and in
table format. In this way it will be easy for operators to calibrate the sensors,
where possible.
     The final setup must be deployed on the expert system (middleware) that
manages the service robots. To validate the described localization method, the
next step is to plan and conduct an experiment; in this way, the concept of the
localization model can be proved.

5. Conclusions
     The combination of optical sensors, an IMU, and encoders in service robots
can bring better robustness and accuracy to the positioning and localization of
service robots. Together, the sensors track and measure the overall velocity,
angular velocity, orientation, and position, as well as the acceleration of the robot.
The localization model is based on the Kalman filter algorithm for a mobile robot
that utilizes the mentioned sensors.
     The proposed approach covers two of the three prototype development layers
of the IHE concept: the infrastructure layer and the data processing and
management layer. The third, the service layer, depends on the application in
which the service robots will operate. Considering the IHE as an operational
environment, and given that the KF algorithm mitigates or avoids the uncertainty
of position estimation/localization, the robots are capable of performing tasks
and activities in a home environment. Proper localization of service robots at any
moment of their operation makes it possible to mount additional equipment
(sensors or actuators) on them and thus perform secondary tasks. There are many
examples, including cameras (for security and/or monitoring purposes), small
carry-on platforms (moving objects from location A to B), and a robotic hand
with fingers to press buttons and/or perform tasks at different heights.
     The results of the proposed approach, including the KF algorithm and the
presented model, can be validated and verified in several experiments. The
validation list shall include key indicators of localization and positioning
accuracy, namely detecting the correct robot path and avoiding the uncertainty of
skidding caused by different obstacles. Another validation concerns key
indicators of performance accuracy and of the number of sensors, depending on
the service robot’s size and application.




6. Acknowledgments
     This paper was prepared with the support of MIRACle: Mechatronics, Inno-
vation, Robotics, Automation, Clean technologies – Establishment and develop-
ment of a Center for Competence in Mechatronics and Clean Technologies –
Laboratory Intelligent Urban Environment, funded by the Operational Program
Science and Education for Smart Growth 2014–2020, Project BG 05M2OP001-
1.002-0011.

7. References
[1]   International Federation of Robotics, Definition of service robots, 2015.
      URL: http://www.ifr.org/service-robots. Accessed 17 Feb 2015.
[2]   Sprenger, M., Mettler, T., Service Robots. Business & Information Systems
      Engineering 57, 271–274 (2015). https://doi.org/10.1007/s12599-015-0389-x.
[3]   Kawamura, K., et al., Design philosophy for service robots. Robotics and
      Autonomous Systems 18(1–2), 109–116 (1996).
[4]   International Organization for Standardization, ISO 8373:2021, Robotics —
      Vocabulary, 2021. URL: https://www.iso.org/obp/ui/#iso:std:75539:en.
[5]   Panigrahi, P. K., Bisoy, S. K., Localization strategies for autonomous mobile
      robots: A review. Journal of King Saud University – Computer and
      Information Sciences (2021). https://doi.org/10.1016/j.jksuci.2021.02.015.



