=Paper= {{Paper |id=Vol-1710/paper18 |storemode=property |title=The 360° Around View System for Large Vehicles, the Methods of Calibration and Removal of Barrel Distortion for Omnidirectional Cameras |pdfUrl=https://ceur-ws.org/Vol-1710/paper18.pdf |volume=Vol-1710 |authors=Aleksey S. Makarov,Marina V. Bolsunovskaya |dblpUrl=https://dblp.org/rec/conf/aist/MakarovB16 }} ==The 360° Around View System for Large Vehicles, the Methods of Calibration and Removal of Barrel Distortion for Omnidirectional Cameras== https://ceur-ws.org/Vol-1710/paper18.pdf
The 360◦ around view system for large vehicles,
the methods of calibration and removal of barrel
distortion for omnidirectional cameras


                       Makarov A.S. and Bolsunovskaya M.V.
    Peter the Great St. Petersburg Polytechnic University, Saint-Petersburg, Russia



        Abstract.    Driving is fraught with an uncontrolled risk of minor collisions
        with other vehicles and pedestrians because of "blind spots": when parking,
        when exiting a parking lot, and while driving in heavy traffic. For
        trucks the task of creating a 360◦ around view system is complicated by
        their large size, the limited view from the cabin and the presence of turning
        trailers, and at the moment no such system is produced. The 360◦ around
        view system being developed will solve these problems by acquiring im-
        ages from 4 cameras and displaying a single image of the vehicle and its
        environment as a "bird's eye" view on a monitor in real time. The article de-
        scribes the existing 360◦ around view systems for passenger cars, gives
        their advantages and disadvantages, and describes the current development
        by Volvo Trucks. The article describes the proposed approaches for cal-
        ibrating the camera and removing barrel distortion, and contains the
        results of calibration and removal of barrel distortion for the camera
        180degree Fisheye Lens 1080p Wide Angle Pc Web USB Camera.

        Keywords:   barrel distortion, camera calibration, 360◦ around view sys-
        tem, large vehicles, omnidirectional camera.


1     Introduction


     Driving is fraught with an uncontrolled risk of minor collisions with other vehi-
cles and pedestrians because of "blind spots": when parking, when exiting a parking
lot, and while driving in heavy traffic. For trucks the task of creating a 360◦
around view system is complicated by their large size, the limited view from the
cabin and the presence of turning trailers, and at the moment such a system is not produced.
    Every year on Europe's roads more than 7 thousand people die and about
100 thousand are injured in accidents involving trucks [1]. The European
Commission and the International Road Union (IRU) conducted a study which
found that about 75% of road accidents involving trucks happen because of
"blind spots" [2]. The "blind spots" are the cause of:
    1. 5% of accidents in which the victims are drivers and passengers of cars
(side collisions during maneuvering).
    2. 35% of accidents in which the victims are unprotected road users (pedes-
trians, cyclists and motorcyclists).
    The scientific and technical problems solved by our work are the de-
velopment of software and hardware for the 360◦ around view system of large
vehicles, including algorithms and mathematical models to display the vehicle's en-
vironment in real time as a bird's-eye view on a monitor. The result of the work
will be a hardware-software complex consisting of 4 (or more, depending on the
results of experiments) wide-angle fish-eye cameras, a hardware unit, a monitor, an
interface and software that uses the developed video processing algorithms. This complex
will display a single image of the vehicle and its environment as a "bird's eye"
view on a monitor in real time, getting the picture from 4 cameras. The system will
be able to estimate the distance to objects in the environment and warn of the
dangers of proximity. In addition, it will include a parking assist system with
"parking lines".
    The advantages of the complex will be operation at speeds higher than in
the existing analogues, operation at low light levels and applicability both to small
and to large vehicles. In addition to their dimensions, large-sized vehicles are
distinguished by rotating parts, such as the trailers of trucks and articulated buses.
The complex being developed will take this feature into account.
2   The existing 360◦ around view systems

     Currently such systems are being developed, but only for passenger
(small-sized) cars. In 2007 the first such system, Around View Monitor, was
presented by Nissan Motor Co., Ltd [3]. Around View Monitor synthesizes a
bird's-eye view of the car and its surroundings from four ultra-wide-angle
(180 degrees, only the side ones), high-definition cameras and displays it on a 5-inch
monitor in the center of the dashboard [4]. There is considerable deformation on
the sides because the cameras are located less than 4 feet from the ground and
must cover more than 16 feet from the car [5]. The screen displays the view from
above on the left, and a front or rear view on the right. Views of the front and
rear are interleaved according to the gear changes [3]. In addition, sensors
installed on all four corners of the car display the distance to an obstacle in easy-
to-understand color graphics (parking lines) and warn the driver with a sound
signal when the vehicle approaches a stationary object [6]. AVM works
in all conditions at speeds up to 10 km/h. AVM is automatically activated when
the transmission selector is in reverse or when the car moves forward
after a click on the Camera button under the display [5].
    Besides Infiniti and Nissan, such automakers as Audi, BMW, Lexus, Mer-
cedes Benz and Toyota have their own 360◦ around view systems. Almost all of
them cooperate with external chip manufacturers such as Freescale Semicon-
ductor, whose industry-specific solutions can be easily configured to work with
the equipment of any automaker [7].
    Some automakers also offer a wide and ultra-wide front and rear view, for
example on BMW SUVs. Land Rover offers the Transparent Bonnet
technology, which allows "seeing through the hood" when climbing over hills
and other obstacles. In addition, it is possible to add a fifth camera at the top
of the liftgate for a less distorted rear view [6].
    VW also offers some additions in its Area View technology [8]:
    1. Unlike the conventional perspective bird's-eye view, the surrounding area is
projected onto a hemisphere.
    2. The system has 17 different virtual camera positions, arranged
so that it guarantees a view of the vehicle and the area around it from every
conceivable perspective.
    3. While driving at low speed over rough terrain or in poor underfoot condi-
tions, a front camera "Off-road" mode, which provides the ability to recognize such
obstacles as large rocks, stumps or holes, is available.
    Research on designing a 360◦ around view system for cargo (large-
sized) vehicles was launched in 2010 by Volvo Trucks; the completion of the work
is planned for 2020 [9]. The "heart" of the system is a digital platform which
retrieves data from cameras, radars and other sensors located on the perimeter of
the vehicle. Scanning is performed every 25 milliseconds, and the field of vision of
the complex is 360 degrees, which provides all-round visibility [10].
    According to the developers' report [11] of 17 March 2014, the project
was in the development stage. At this stage they had developed a functioning
independent system for threat detection and decision-making. Studies of driver
behavior on the basis of historical data, aimed at determining his passivity/activity,
are under way. Volvo developed a Fusion sensor that allows merging the images
at a low level and treating the sensor system as a single unit.
    During our research of the existing analogues, we have identified some of their
shortcomings.
    In systems for small vehicles:
    1. Blind spots remain.
    2. Distortion at the periphery of the camera's field of view => distortion of
the shape and size of objects.
    3. They work only at low vehicle speeds.
    For large vehicles the following are added:
    1. Large size => an increased number of blind spots.
    2. Rotating trailers => increased complexity of image stitching.
    To solve the problems of blind spots and distortion, we decided to use
ultra-wide-angle fish-eye cameras. The methods for their use are described
in the next section.
3   Calibration and removal of barrel distortion


    Since the main task is to obtain a picture of the car from the top, and the
system being developed uses wide-angle fish-eye cameras (Fig. 1), we
need to get from each of the 4 cameras its orthographic view from above [12].
   In turn, this procedure includes three sub-procedures:
   1. Camera calibration.
   2. Getting an image without barrel distortion using the calibration results.
   3. Getting orthographic projection.
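These three sub-procedures can be sketched as a simple pipeline. The function bodies below are illustrative placeholders only (the names, file names and returned values are assumptions for the example, not the actual implementation):

```python
# Illustrative pipeline skeleton for one camera; every function body is a
# placeholder standing in for the real calibration/undistortion/projection step.

def calibrate(chessboard_images):
    """Sub-procedure 1: estimate camera parameters (placeholder)."""
    # In practice a calibration toolbox would process the chessboard images.
    return {"poly_coeffs": [250.0, 0.0, -0.001], "center": (320, 240)}

def undistort(frame, params):
    """Sub-procedure 2: remove barrel distortion using calibration results."""
    # Placeholder: a real implementation remaps every pixel of the frame.
    return {"frame": frame, "undistorted_with": params["poly_coeffs"]}

def to_orthographic(undistorted):
    """Sub-procedure 3: project the corrected image to a top view."""
    return {"top_view_of": undistorted["frame"]}

params = calibrate(["board_01.png", "board_02.png"])   # hypothetical file names
top = to_orthographic(undistort("front_camera_frame", params))
```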




         Fig. 1. A front view of a vehicle, obtained from the fisheye camera.


    In the study, we analyzed the following methods and algorithms:
    1. Algorithm by Luis Puig and others [13]. It is based on the use of DLT
(Direct Linear Transform) and is used for catadioptric cameras. This method uses a
three-dimensional model of a calibration chessboard.
    2. Algorithm by Joao P. Barreto and Helder Araujo [14]. The method is based
on special properties of conic curves that allow calibrating the camera using
at least three straight lines. It is used for catadioptric cameras.
    3. Algorithm by Christopher Mei and Patrick Rives [15]. It is used for omnidi-
rectional cameras. It is based on the generalized projection matrix of a fish-eye
camera. For the calibration a few pictures of the chessboard are used.
    4. Algorithm by Ellen Schwalbe [16]. This method is used for fisheye cam-
eras. It is based on the property that there is an approximately linear relationship
between the angle of incidence of the ray to a point of the object and the distance
to that point in the image. Calibration is carried out using a set of points distributed
around the room.
    5. Algorithm by Tsudikov M.B. [17]. It uses the same properties of fisheye
cameras as the previous algorithm, but it is used to get rid of the distortion.
    6. Algorithm by Biryukov E.D. [18]. This method allows obtaining a "top view"
for fragments of images located away from the optical axis of the camera. It is
based on the construction of a grid on a fragment and applying to it linear
transformations whose result is a rectangular area. The author acknowledges that
barrel distortion is not completely removed.
    7. Algorithm by Davide Scaramuzza.
    The method of camera calibration by Davide Scaramuzza was selected on
the basis of the comparative analysis and taking into account the data of the
research presented in [19]. The advantage of this method is that it does not use
a parametric model that is individual for each specific camera, which allows
unification. To determine the distance to a point of space in the image,
this approach uses a function based on a Taylor series:
    f(u'', v'') = a0 + a1 ρ + a2 ρ^2 + ... + aN ρ^N,

    where ρ = √(u''^2 + v''^2), u'' and v'' are the image point's coordinates in the
plane of the camera, and a0 is a scaling factor.
    This method is described in more detail in [20].
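As an illustration, the model can be evaluated directly: a pixel (u'', v'') on the camera plane is lifted to the viewing ray (u'', v'', f(ρ)). The coefficient values below are invented purely for the example and do not come from a real calibration:

```python
import math

def scaramuzza_f(rho, coeffs):
    """Evaluate f(rho) = a0 + a1*rho + a2*rho^2 + ... + aN*rho^N."""
    return sum(a * rho ** n for n, a in enumerate(coeffs))

def pixel_to_ray(u, v, coeffs):
    """Lift an image point (u'', v'') on the camera plane to a 3-D viewing ray.

    In this model the ray through the pixel is (u'', v'', f(rho)), where rho
    is the distance of the point from the image center.
    """
    rho = math.hypot(u, v)
    return (u, v, scaramuzza_f(rho, coeffs))

# Made-up coefficients for illustration only (a real calibration, e.g. with
# OCamCalib, would estimate these from chessboard images).
coeffs = [340.0, 0.0, -0.0012]          # a0, a1, a2
ray = pixel_to_ray(100.0, 50.0, coeffs)
```

Here ρ^2 = 100^2 + 50^2 = 12500, so the third ray component is 340 - 0.0012 * 12500 = 325.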
    The methodology is implemented as the "OCamCalib" tool for MAT-
LAB [21]. This tool is used by organizations such as NASA, PHILIPS, BOSCH,
DAIMLER. For the calibration we used a set of images of a "chessboard" (Fig.
2). After loading the images (at least 9) and entering the dimensions of
the calibration board, the program automatically determines the corner points
of the grid. In this paper we use a board of 9x7 cells
with an edge length of 23 mm.




                 Fig. 2. Snapshots of the chessboard used for calibration.


    At the next stage the center of the grid is calculated, and on the basis of all
available data the distortion of the board and the distances to all corners of the
grid are calculated with a margin of error.
    Because calibration accuracy is very important for our system, we determined
experimentally the conditions under which its results are the most adequate.
The intermediate result of the algorithm, the average reprojection
error (hereinafter "err") of the grid cells (that is, the difference between the
calculated and the real size, position and shape of the cells), was used to determine
the quality of calibration and its dependence on parameters such as:
   1. The number of input images.
   2. The number of cells of the calibration board.
   3. The degree of illumination.
   Results are shown in Tables 1-3.
 Table 1. Dependence of the calibration results on the number of calibration images.

        Number of images    3      4     5     6     7     8     9     10    11
              err         30.593 16.547 9.9  3.065 1.063 0.896 0.478 0.355 0.32


   Depending on the requirements on the value of the error, it is possible to
determine the minimum required number of images (Table 1).
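Given the measurements from Table 1, choosing the minimum number of images for a required error budget reduces to a simple lookup (the threshold of 1.0 below is an arbitrary example value):

```python
# Average reprojection error vs. number of calibration images (from Table 1).
err_by_images = {3: 30.593, 4: 16.547, 5: 9.9, 6: 3.065, 7: 1.063,
                 8: 0.896, 9: 0.478, 10: 0.355, 11: 0.32}

def min_images_for(threshold, measurements):
    """Smallest image count whose measured error is below the threshold."""
    suitable = [n for n, err in sorted(measurements.items()) if err < threshold]
    return suitable[0] if suitable else None

# Example: requiring err < 1.0 calls for at least 8 images with these data.
n = min_images_for(1.0, err_by_images)
```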
Table 2. Dependence of the calibration results on the number of cells of the calibration
board.

                           Board size 5x7 6x8 7x9 8x10
                              err    1.489 0.76 0.478 0.389


   The experiment showed that the number of grid cells does not significantly
affect the results of the calibration (Table 2).
    Table 3. Dependence of the calibration results on the degree of illumination.

    Variant of illumination                    err
    daylight                                  11.390
    room light                                 1.879
    table lamp                                 5.289
    room light + backlighting by a lantern     0.478


    As seen from the experimental results, the method is sensitive to the degree of
illumination of the room and of the calibration board (Table 3). This dependence
of the results on the lighting is caused by reduced contrast of the image at lower
levels of illumination, that is, the difference between the white and black cells is
reduced.
    The result of the calibration is the above function, or rather its coefficients,
which are used in the second sub-procedure. It was also performed using
Davide Scaramuzza's tools, but this time implemented as an application in C++
using the OpenCV library.
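The core of such an undistortion application is an inverse pixel remap: for every pixel of the corrected output image, the corresponding source position in the distorted input is computed and sampled. The sketch below illustrates the idea with a deliberately simplified one-coefficient radial model and nearest-neighbour sampling on a tiny synthetic image; the real application uses the full calibrated camera model instead:

```python
def undistort_nearest(img, k, cx, cy):
    """Remove simple radial (barrel) distortion by inverse mapping.

    img : list of rows of pixel values (a tiny grayscale image).
    k   : single coefficient of the toy radial model r_d = r_u * (1 + k*r_u^2).
    For each pixel of the undistorted output we compute the corresponding
    position in the distorted input and copy the nearest source pixel.
    """
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = x - cx, y - cy
            scale = 1.0 + k * (dx * dx + dy * dy)   # toy radial model
            sx = int(round(cx + dx * scale))
            sy = int(round(cy + dy * scale))
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = img[sy][sx]
    return out

# Tiny synthetic 5x5 image; with k = 0 the remap is the identity mapping.
img = [[10 * y + x for x in range(5)] for y in range(5)]
same = undistort_nearest(img, 0.0, 2.0, 2.0)
```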
    Since we are developing a 360◦ around view system, we used as the input an
image (Fig. 3) in which the calibration board occupies the area presumed to be
output on the monitor. The result of the program was a corrected image without
barrel distortion. Fig. 4 shows that the remaining distortion is insignificant
(indicated by green lines).
                 Fig. 3. The original picture with barrel distortion.




              Fig. 4. The corrected picture without barrel distortion.


4   Conclusions and further research


    The paper presents research and analysis of the existing 360◦ around view
systems for cars, describing their strengths and weaknesses.
    We also analyzed algorithms of omnidirectional camera calibration and
algorithms for removal of distortion. Based on Davide Scaramuzza's
methodology we obtained calibration data for the camera 180degree Fisheye Lens 1080p
Wide Angle Pc Web USB Camera. The paper shows the results of the calibration
for various parameters. We obtained images without barrel distortion.
    The results of the first two stages will be used at the next stage, when obtaining
an orthographic top view. Research will be focused on the development of an
algorithm which allows obtaining a horizontal projection of the front view (and
the other views) of the car. The constancy of this area simplifies matters and
allows making the calculations once and then applying them. We are going to
calculate the dependence of the conversion results on the height of the camera
above the ground surface and its angle of tilt. The quality of the results will be
checked by comparison with a photo of the field taken from above.
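The dependence on camera height and tilt angle mentioned above can be illustrated with elementary pinhole geometry: a ray leaving the camera at vertical angle φ below the horizon reaches the ground at distance h/tan(φ). The function below is only a sketch of that basic relationship, not the algorithm under development:

```python
import math

def ground_distance(h, tilt_deg, pixel_angle_deg):
    """Distance along the ground to the point seen at a given vertical angle.

    h               : camera height above the ground (metres).
    tilt_deg        : downward tilt of the optical axis from the horizontal.
    pixel_angle_deg : additional vertical angle of the ray below the optical
                      axis (0 for the image centre).
    Returns None when the ray does not hit the ground in front of the camera.
    """
    phi = math.radians(tilt_deg + pixel_angle_deg)  # total angle below horizon
    if phi <= 0:
        return None
    return h / math.tan(phi)

# A camera 2.5 m up, tilted 45 degrees down: the centre ray lands 2.5 m out.
d = ground_distance(2.5, 45.0, 0.0)
```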
References


1. Why do trucks get into accidents? [Pochemu gruzovye avtomobili popadaiut
   v avarii?] // TRANSPORTAL. Available at: http://www.transportal.by/
   obsujdaem/pochemu-gruzovye-avtomobili-popadayut-v-avarii.html (in Rus-
   sian)
2. What is a blind spot and how to protect yourself from it? [Chto takoe mertvaia zona i kak
   obezopasit'sia?] // Gazu.ru. 2012. Available at: http://www.gazu.ru/safety/bdd/
   10225/ (in Russian)
3. Nissan releases details about "Around View Monitor" // autoblog. 2007.
   Available at: http://www.autoblog.com/2007/10/12/nissan-releases-details-
  about-around-view-monitor/
4. Around View Monitor.// Nissan. Available at: http://www.nissan-global.com/
  EN/TECHNOLOGY/OVERVIEW/avm.html
5. The 360◦ around view system by Nissan: the world around you. [Sistema krugovogo
   obzora Nissan: mir vokrug tebia.] // CARSGURU. 2012. Available at: http://
   carsguru.net/articles/252/view.html (in Russian)
6. Howard, B. What are car surround view cameras, and why are they
   better than they need to be? // ExtremeTech. 2014. Available at:
  http://www.extremetech.com/extreme/186160-what-are-surround-view-
  cameras-and-why-are-they-better-than-they-need-to-be
7. Howard, B. Nissan Pathfinder 2013 review: 4 cameras, 360-degree cov-
   erage, no-crunch parking. // ExtremeTech. 2012. Available at: http:
   //www.extremetech.com/extreme/140942-nissan-pathfinder-2013-review-
   4-cameras-360-degree-coverage-no-crunch-parking
8. Area View. // The Official Website for Volkswagen UK. Available at: http://www.
   volkswagen.co.uk/technology/proximity-sensing/area-view
9. VOLVO TRUCKS ANNOUNCED THE 360◦ AROUND VIEW SYSTEM.
   [VOLVO TRUCKS ANONSIROVALA SISTEMU KRUGOVOGO OBZORA.] //
   MILAN GROUP. 2014. Available at: http://milan-group.ru/volvo-trucks-
   anonsirovala-sistemu-krugovogo-obzora/ (in Russian)
10. New technology enables all-around view. // VOLVO TRUCKS KUWAIT.
   2014. Available at: http://www.volvotrucks.com/trucks/kuwait-market/en-kw/
  newsmedia/pressreleases/Pages/pressreleases.aspx?pubID=18280
11. Almevad, A. Final Report Non Hit Car And Truck 2010-2013. // VINNOVA.
   2014. Available at: http://www.vinnova.se/PageFiles/751290059/2010-01148_
  publikrapport_EN.pdf
12. Valiev, I. V. and Voloboi, A. G., 2010, Modeling of 360◦ around view monitor.
   [Modelirovanie monitora krugovogo obzora.] Trudy 20-oi mezhdunarodnoi konfer-
   entsii po komp'iuternoi grake i zreniiu GRAFIKON-2010, pp. 269-272. (in Rus-
   sian)
13. Puig, L., Bastanlar, Y., Sturm, P., Guerrero, J. J. and Barreto, J., 2011. Cali-
   bration of central catadioptric cameras using a DLT-like approach. International
   Journal of Computer Vision, 93(1), pp.101-114.
14. Barreto, J. P. and Araujo, H., 2005. Geometric properties of central catadioptric
   line images and their application in calibration. Pattern Analysis and Machine In-
   telligence, IEEE Transactions on, 27(8), pp.1327-1333.
15. Mei, C. and Rives, P., 2007, April. Single view point omnidirectional camera cal-
   ibration from planar grids. In Robotics and Automation, 2007 IEEE International
   Conference on (pp. 3945-3950). IEEE.
16. Schwalbe, E., 2005, February. Geometric modelling and calibration of fisheye lens
   camera systems. In Proc. 2nd Panoramic Photogrammetry Workshop, Int. Archives
   of Photogrammetry and Remote Sensing (Vol. 36, No. Part 5, p. W8).
17. Tsudikov, M. B., 2011. Reduction of the image from the "Fish eye" type camera
   to the standard television. [Privedenie izobrazheniia iz kamery tipa "Rybii glaz" k
   standartnomu televizionnomu.] Izvestiia Tul'skogo gosudarstvennogo universiteta.
   Tehnicheskie nauki, (5-1). (in Russian)
18. Biriukov, E. D., 2015. Image correction algorithm from the wide-angle camera of
   car rear view. [Algoritm korrektsii izobrazheniia s shirokougol'noi kamery zadnego
   vida avtomobilya.] Novye informatsionnye tekhnologii v avtomatizirovannykh sis-
   temakh, (18). (in Russian)
19. Lazarenko, V. P., Dzhamiykov, T, S., Korotaev, V. V., Yaryshev, S. N. Transfor-
   mation algorithm for images obtained by omnidirectional cameras. [Algoritm pre-
   obrazovaniia izobrazhenii, poluchennyh vsenapravlennymi optiko-elektronnymi sis-
   temami.] Nauchno-tekhnicheskii vestnik informatsionnykh tekhnologii, mekhaniki i
   optiki 15, no. 1 (2015): 82-90. (in Russian)
20. Scaramuzza, D., Martinelli, A. and Siegwart, R., 2006, January. A flexible tech-
   nique for accurate omnidirectional camera calibration and structure from motion.
   In Computer Vision Systems, 2006 ICVS'06. IEEE International Conference on (pp.
   45-45). IEEE.
21. Scaramuzza, D. OCamCalib: Omnidirectional Camera Calibration Toolbox for
   Matlab. Available at: https://sites.google.com/site/scarabotix/ocamcalib-
   toolbox