3D Trajectory Registration for Sensor Calibration
Tekla Tóth1,*,† , Gábor Valasek1,† and Levente Hajder1,†
1 Department of Algorithms and Their Applications, Eötvös Loránd University, Pázmány Péter s. 1/C, 1117 Budapest, Hungary


                                             Abstract
Automatic sensor calibration is ubiquitous for robots and autonomous vehicles. Traditional target-based calibration methods typically require multiple registrations of an object in the field of view. However, by moving the object continuously while recording, the 3D positions of the target generate a path that may improve the calibration quality.
In this work, we propose a novel curve-based trajectory registration method to enhance calibration accuracy with spherical and checkerboard targets. This approach supplements the existing target position point set with estimated intermediate points obtained by applying spline fitting techniques to the measurements. The curvature of the movement is aligned between the point sets, resulting in a reduced error between the registered point clouds and more accurate extrinsic calibration parameters. We validate these properties in indoor and outdoor real-world scenarios.

                                             Keywords
sensor calibration, 3D trajectory, movement estimation, interpolation, point set alignment



1. Introduction

The alignment of two or more point sequences occurs in various computer vision, computer graphics, and surface modeling problems [1]. Generally, state-of-the-art target-based sensor calibration can estimate the extrinsic parameters between, e.g., a LiDAR sensor and a digital camera from a small number of input data pairs. Both the sphere-based method of Tóth et al. [2] and the checkerboard-based calibration of Zhou et al. [3] apply point set registration to detected feature points to find the sought parameters. These methods can find the positions of a sphere center, or the edges and corners of a checkerboard pattern, in 3D from the sensor data of different modalities. The camera and the LiDAR take snapshots simultaneously while the target is moved around in the scene. In the final step, the original pipeline of these methods applies the point pair registration (PPR) of Arun et al. [4]: the detected pairwise positions are registered. This approach can be applied both between planar and between spatial point sets.

Our goal is to improve object-based calibration processes via the directed continuous movement of the target. Instead of treating each calibration image pair independently and using point pairwise registration on them, we process the estimated trajectory points of the human-controlled movement (see Fig. 2) as point clouds for each sensor device. Then we use point cloud registration methods [5, 6, 7] to compute the rigid body motion between the devices. This requires sufficiently large input data sets.

Our main insight is to expand the point clouds with inferred trajectory points. These are obtained by fitting an interpolating spline to the captured trajectory data and evaluating this spline at a user-adjustable frequency that is higher than the sampling rate. This assumes temporal and spatial coherence between the calibration images, and we show that it improves calibration accuracy. The proposed process is visualized in Fig. 1. We evaluate a spherical realization of the proposed method empirically for camera-LiDAR calibration in Section 3.

The motivation for our work was to improve the multimodal sensor calibration of a test vehicle (see Fig. 3) with a reduced camera data rate. In particular, our setup provided a 2-4 FPS (frames per second) video stream, which resulted in a sparse trajectory sampling with about 10-20 centimeters between consecutive points. An example image sequence is presented in Fig. 2. Our idea is to approximate the trajectory with curve segments to get a more detailed path containing synthetic positions. During the measurement, synchronization errors occur due to the LiDAR turnaround time. This means that the endpoints of the paths are not necessarily aligned, but the common parts of the curves are, as much as possible. Regardless of the endpoint misalignments, the interior of the path itself may be robustly inferred; therefore, we can interpolate the curve between the measured data points and add synthetic intermediate points. This technique can also be applied in the case of surfaces defined by a sparse point set [8].

26th Computer Vision Winter Workshop, Robert Sablatnig and Florian Kleber (eds.), Krems, Lower Austria, Austria, Feb. 15-17, 2023
* Corresponding author.
† These authors contributed equally.
Email: teklatoth@inf.elte.hu (T. Tóth); valasek@inf.elte.hu (G. Valasek); hajder@inf.elte.hu (L. Hajder)
Homepage: http://cv.inf.elte.hu/ (L. Hajder)
ORCID: 0000-0002-7439-0473 (T. Tóth); 0000-0002-0007-8647 (G. Valasek); 0000-0001-9716-9176 (L. Hajder)
© 2023 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).








Figure 1: Schematic image of the proposed trajectory registration pipeline. The detailed process is discussed in Alg. 2.




Figure 2: An input image sequence from the camera. In these images, the sphere center moves around 10-20 𝑐𝑚 from one frame to the next.



Figure 3: Camera-LiDAR setup of the test vehicle for the real-world tests. During the experiments, one camera was calibrated to the LiDAR. Extending the proposed approach, multi-sensor calibration can be applied to all five cameras and the LiDAR.

1.1. Camera-LiDAR Synchronisation

We provide an overview of the synchronization system between the camera and the LiDAR, shown in Fig. 4. A Velodyne VLP-16 LiDAR device spins at 1200 RPM (rotations per minute) while an Arduino trigger box triggers the camera at 4 FPS. When a trigger sign arrives, the camera mounted on the same vehicle exposes for a varying duration, at most 0.04 seconds. The picture is then taken by a global shutter, and the system triggers the LiDAR to save a synchronized turnaround. The LiDAR saves the data packages of every 15° rotation slice. When it gets the trigger, it finishes the current chunk of data, labels it as the end of the scan to be saved, and identifies the last 360° rotation as one complete turnaround synchronized to the current image. However, in the case of a moving object, the worst case is when the object appears at the beginning of the point cloud, which may cause an error of 0.05 s at maximum. If the target object moves with a typical velocity of 0.2–0.4 m/s, this causes a 1–2 cm error. Therefore, the point pairwise registration may be inaccurate, while the proposed solution addresses this particular problem.

2. Proposed method

This section introduces the evaluated variations of the proposed pose estimation described in Fig. 1, using interpolated points on the estimated trajectory and point cloud trajectory registration. Let 𝒫^ℒ and 𝒫^𝒞 denote the point sets of estimated sphere centers from LiDAR data and camera images, respectively. The final task is to find the rigid transformation matrix T = [R | t], constructed from a rotation matrix R and a translation vector t. In this paper, R^ℒ𝒞 and t^ℒ𝒞 describe the sought rotation and translation from the camera frame to the LiDAR frame. In the particular case of sphere-based estimation, n ≥ 4 point pairs are required.

First, let us consider the point pair solution shown in Alg. 1, based on the method of Arun et al. [4]. The point set registration is defined as a minimization problem:

    argmin_{R^ℒ𝒞, t^ℒ𝒞} Σ_{i=1}^{n} ‖ p_i^ℒ − R^ℒ𝒞 p_i^𝒞 − t^ℒ𝒞 ‖₂²,   (1)

where p_i^ℒ and p_i^𝒞 denote the i-th point pair of the LiDAR and the camera data.






[Figure 4 timeline: the LiDAR (20 Hz) stores 15° rotation slices and marks a saved synchronized turnaround (0.05 s per full turnaround) at each trigger; the camera (4 FPS) exposes (max. 0.04 s), fires its global shutter, and reads out each frame; the trigger box emits trigger signs 0.25 s apart.]

Figure 4: Hardware synchronization of the real camera-LiDAR system. The varying exposure time causes an unpredictable noise, which motivates us to find a more robust solution for the point set registration during the calibration process.
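To make the timing budget of Fig. 4 concrete: the 1–2 cm bound quoted in Sec. 1.1 follows from the turnaround time alone. A quick check (plain Python, our own illustration, not part of the original pipeline):

    # Worst-case synchronization offset implied by the LiDAR turnaround.
    turnaround = 60.0 / 1200.0       # 1200 RPM -> 0.05 s per full rotation
    for v in (0.2, 0.4):             # typical target velocities in m/s
        print(f"v = {v} m/s -> up to {v * turnaround * 100:.0f} cm of drift")
    # prints 1 cm and 2 cm, matching the bound quoted in Sec. 1.1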



The translation is eliminated by choosing the point set barycenters as the coordinate system origins of the two point sets. The rotation can then be derived using singular value decomposition (SVD) on the matrix H constructed from the point sets. If the point pairs are perfectly synchronized and noiseless, this method guarantees optimal results. However, in the presence of noise, such as the synchronization errors described in Sec. 1.1, Eq. (1) no longer provides optimal solutions.

The improvements we propose are described in Alg. 2. The initial step is the pre-filtering of the point sets, because noisy data may occur. It is based on the scaled median absolute deviation

    m_i = s · median(|x_i − median(x)|),   (2)

where x_i ∈ x is an element of a data array and s is the scale. As the approach focuses on the trajectories and not on the pairwise synchronized data, if a measurement (either from the LiDAR or from the camera) is filtered out as noise, the corresponding datum may remain in the other point set, in contrast to the point pair registration. The algorithm then has two main steps: (i) estimating the trajectory of the movement and (ii) registering the point clouds. The next subsections present all the realizations of these two steps that we evaluate in this paper.

2.1. Trajectory estimation

For trajectory estimation, the estimated 3D sphere positions can be interpreted as control points. To model the movement, the two main questions are the choice of the interpolation method and of the sampling frequency used to add new synthetic points to the point sets.

To find the best method, the following requirements must be met: the internal control points have to be interpolated, the segments must follow the knots tightly without self-intersections between the points, and the final segment sequence has to be continuous and easy to evaluate at any point.

Linear interpolation (LI). A naive solution is linear interpolation between neighboring measurement points, which reconstructs the path with line segments. As rapid changes in the trajectory, and hence derivatives of large magnitude, are not expected, the line-based approach is a good initial step; however, its precision depends on the speed and complexity of the target object's movement. If two adjacent control points are denoted by p_i and p_{i+1}, where i ≤ n − 1, n is the number of points, and t ∈ [0, 1] is an independent parameter, the interpolation is

    c(t) = (1 − t) · p_i + t · p_{i+1}.   (3)






Algorithm 1 Pose estimation with point pair registration – PPR [4]
Input: sphere centers from LiDAR data 𝒫^ℒ = {p_i^ℒ ∈ ℝ³ | i = 1, …, n, n ≥ 4}; sphere centers from camera images 𝒫^𝒞 = {p_i^𝒞 ∈ ℝ³ | i = 1, …, n, n ≥ 4}
Output: rotation matrix R^ℒ𝒞 ∈ ℝ^{3×3}; translation vector t^ℒ𝒞 ∈ ℝ^{3×1}
 1: t^ℒ𝒞 := 𝒫^𝒞 − 𝒫^ℒ    ◁ The difference between the centers of gravity
 2: H := Σ_{i=1}^{n} p_i^ℒ (p_i^𝒞)^T
 3: USV^T := SVD(H)    ◁ Singular value decomposition of H
 4: R^ℒ𝒞 := VU^T
 5: return R^ℒ𝒞, t^ℒ𝒞
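For illustration, a minimal numpy sketch of Alg. 1 (function and variable names are ours). Unlike the condensed pseudocode, it centers both point sets before building H, restores the translation from the rotated centroid, and adds the usual determinant check against reflections:

    import numpy as np

    def arun_ppr(p_lidar, p_cam):
        # Least-squares rigid alignment of paired 3D points (Arun et al. [4]).
        # Solves argmin_{R,t} sum_i || p_lidar[i] - R @ p_cam[i] - t ||^2
        # for two (n, 3) arrays of corresponding sphere centers, n >= 4.
        mu_l = p_lidar.mean(axis=0)              # centroid of the LiDAR points
        mu_c = p_cam.mean(axis=0)                # centroid of the camera points
        H = (p_cam - mu_c).T @ (p_lidar - mu_l)  # cross-covariance of centered sets
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against a reflection
            Vt[-1, :] *= -1.0
            R = Vt.T @ U.T
        t = mu_l - R @ mu_c                      # translation from rotated centroid
        return R, t

With exact, noise-free correspondences this recovers the global optimum of Eq. (1); Sec. 1.1 explains why real point pairs violate that assumption.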


Algorithm 2 Pose estimation with point cloud registration
Input: sphere centers from LiDAR data 𝒫^ℒ = {p_i^ℒ ∈ ℝ³ | i = 1, …, n, n ≥ 4}; sphere centers from camera images 𝒫^𝒞 = {p_i^𝒞 ∈ ℝ³ | i = 1, …, n, n ≥ 4}; sampling interval d > 0 on the trajectory spline approximation
Output: rotation matrix R^ℒ𝒞 ∈ ℝ^{3×3}; translation vector t^ℒ𝒞 ∈ ℝ^{3×1}
 1: 𝒫^ℒ, 𝒫^𝒞 := prefilter(𝒫^ℒ, 𝒫^𝒞)    ◁ Based on the scaled median absolute deviation in Eq. (2)
 2: 𝒫^ℒ* := addInterpolatedPoints(𝒫^ℒ, d)    ◁ Linear interpolation, Catmull–Rom spline [9],
 3: 𝒫^𝒞* := addInterpolatedPoints(𝒫^𝒞, d)    ◁ or Kochanek–Bartels spline [11]
 4: R^ℒ𝒞, t^ℒ𝒞 := registerPointClouds(𝒫^ℒ*, 𝒫^𝒞*)    ◁ ICP [5, 6] or CPD [7]
 5: return R^ℒ𝒞, t^ℒ𝒞
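The skeleton of Alg. 2 then reads as below (our sketch, not the authors' MATLAB code). The cut-off rule in the pre-filter is our reading of Eq. (2), with the scale s acting as a threshold multiplier; `interpolate` and `register` stand for the LI/CRS/KBS and ICP/CPD choices discussed in Sections 2.1 and 2.2:

    import numpy as np

    def prefilter_mad(points, s=3.0):
        # Scaled median absolute deviation of Eq. (2), applied per axis;
        # treating s as a cut-off multiplier is our reading, not a stated rule.
        med = np.median(points, axis=0)
        dev = np.abs(points - med)               # |x_i - median(x)| per axis
        mad = np.median(dev, axis=0)             # median absolute deviation
        keep = np.all(dev <= s * np.maximum(mad, 1e-12), axis=1)
        return points[keep]

    def pose_estimation(p_lidar, p_cam, d, interpolate, register):
        # Alg. 2 skeleton. Each set is filtered independently, so dropping a
        # noisy LiDAR point does not force dropping its camera counterpart.
        p_l = prefilter_mad(p_lidar)             # step 1: pre-filtering
        p_c = prefilter_mad(p_cam)
        p_l_star = interpolate(p_l, d)           # steps 2-3: synthetic points
        p_c_star = interpolate(p_c, d)           # (LI, CRS [9] or KBS [11])
        return register(p_l_star, p_c_star)      # step 4: ICP [5, 6] or CPD [7]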



Catmull–Rom spline (CRS). The second approach is a curve-based solution applying the cubic centripetal Catmull–Rom spline [9]. The Catmull–Rom parameterization produces curves that move toward the next control point and have small derivatives around the control points. The advantage of the centripetal knot parameterization is that it fulfills the aforementioned requirements; e.g., no self-intersections occur [10], in contrast to the uniform or chordal parameterizations. The control points are denoted by p_i, and the knots t_i, i = 0, 1, 2, 3, are also predefined. The derivatives are given as

    m_i = (p_{i+1} − p_{i−1}) / (t_{i+1} − t_{i−1}).   (4)

Based on the points p_i and the derivatives m_i calculated by (4), the Hermite curve is fitted pairwise to the data. The equation of the spline is written as

    c(t) = ((t_2 − t)/(t_2 − t_1)) b_1 + ((t − t_1)/(t_2 − t_1)) b_2,   (5)

where

    b_1 = ((t_2 − t)/(t_2 − t_0)) a_1 + ((t − t_0)/(t_2 − t_0)) a_2,
    b_2 = ((t_3 − t)/(t_3 − t_1)) a_2 + ((t − t_1)/(t_3 − t_1)) a_3,
    a_1 = ((t_1 − t)/(t_1 − t_0)) p_0 + ((t − t_0)/(t_1 − t_0)) p_1,
    a_2 = ((t_2 − t)/(t_2 − t_1)) p_1 + ((t − t_1)/(t_2 − t_1)) p_2,
    a_3 = ((t_3 − t)/(t_3 − t_2)) p_2 + ((t − t_2)/(t_3 − t_2)) p_3.

The parameter t ∈ [t_1, t_2] interpolates between p_1 and p_2. If the point coordinates are denoted by p_i = [x_i, y_i, z_i], then the knot parameterization of the CRS is

    t_{i+1} = l^α + t_i,   (6)

where

    l = √((x_{i+1} − x_i)² + (y_{i+1} − y_i)² + (z_{i+1} − z_i)²),   (7)

and the scale α = 0.5 yields the centripetal spline.

Kochanek–Bartels spline (KBS). The third trajectory approximation curve we evaluate is the Kochanek–Bartels spline [11], a generalized version of the Catmull–Rom spline extended by three parameters: the tension 𝒯 ∈ [−1, 1], the bias ℬ ∈ [−1, 1], and the continuity 𝒞 ∈ [−1, 1]. The derivative calculation is modified in the following way; the derivatives at the endpoints of the i-th segment are

    m_i = ((1 − 𝒯)(1 + ℬ)(1 + 𝒞)/2) (p_i − p_{i−1}) + ((1 − 𝒯)(1 − ℬ)(1 − 𝒞)/2) (p_{i+1} − p_i),   (8)

    m_{i+1} = ((1 − 𝒯)(1 + ℬ)(1 − 𝒞)/2) (p_{i+1} − p_i) + ((1 − 𝒯)(1 − ℬ)(1 + 𝒞)/2) (p_{i+2} − p_{i+1}).   (9)

The final curve segment can be defined as a cubic Hermite spline between the internal control points p_i and p_{i+1}, for any t ∈ [0, 1], as

    c(t) = (2t³ − 3t² + 1) p_i + (t³ − 2t² + t) m_i + (−2t³ + 3t²) p_{i+1} + (t³ − t²) m_{i+1}.   (10)
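Equations (4)-(10) translate almost directly into code. A compact sketch (our naming) of the centripetal knot spacing of Eqs. (6)-(7), the pyramidal CRS evaluation of Eq. (5), and the KBS tangents with the Hermite segment of Eqs. (8)-(10):

    import numpy as np

    def centripetal_knots(p0, p1, p2, p3, alpha=0.5):
        # Knot sequence t_0..t_3 from Eqs. (6)-(7); alpha = 0.5 gives the
        # centripetal parameterization. Consecutive points must be distinct.
        t = [0.0]
        for a, b in ((p0, p1), (p1, p2), (p2, p3)):
            t.append(t[-1] + np.linalg.norm(b - a) ** alpha)
        return t

    def catmull_rom_point(p0, p1, p2, p3, u, alpha=0.5):
        # Pyramidal evaluation of Eq. (5): u in [0, 1] is mapped to
        # t in [t_1, t_2], the segment between p1 and p2.
        t0, t1, t2, t3 = centripetal_knots(p0, p1, p2, p3, alpha)
        t = t1 + u * (t2 - t1)
        a1 = (t1 - t) / (t1 - t0) * p0 + (t - t0) / (t1 - t0) * p1
        a2 = (t2 - t) / (t2 - t1) * p1 + (t - t1) / (t2 - t1) * p2
        a3 = (t3 - t) / (t3 - t2) * p2 + (t - t2) / (t3 - t2) * p3
        b1 = (t2 - t) / (t2 - t0) * a1 + (t - t0) / (t2 - t0) * a2
        b2 = (t3 - t) / (t3 - t1) * a2 + (t - t1) / (t3 - t1) * a3
        return (t2 - t) / (t2 - t1) * b1 + (t - t1) / (t2 - t1) * b2

    def kb_tangents(pm1, p0, p1, p2, T=0.0, B=0.0, C=0.0):
        # Kochanek-Bartels tangents of the segment p0 -> p1, Eqs. (8)-(9).
        m0 = 0.5 * ((1-T)*(1+B)*(1+C) * (p0 - pm1)
                    + (1-T)*(1-B)*(1-C) * (p1 - p0))
        m1 = 0.5 * ((1-T)*(1+B)*(1-C) * (p1 - p0)
                    + (1-T)*(1-B)*(1+C) * (p2 - p1))
        return m0, m1

    def hermite_point(p0, p1, m0, m1, t):
        # Cubic Hermite segment of Eq. (10), t in [0, 1].
        t2, t3 = t * t, t * t * t
        return ((2*t3 - 3*t2 + 1) * p0 + (t3 - 2*t2 + t) * m0
                + (-2*t3 + 3*t2) * p1 + (t3 - t2) * m1)

As a consistency check, setting 𝒯 = ℬ = 𝒞 = 0 in kb_tangents reproduces the uniform Catmull–Rom tangent of Eq. (4).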







Figure 5: 3D sphere center trajectories after registration with point pair registration (PPR), iterative closest point (ICP) and
coherent point drift (CPD) algorithms. In this case, no interpolated positions were added.



Sampling interval. Regardless of the applied interpolation method in the second and third steps of Alg. 2, the sampling interval d along the trajectory approximation spline has to be given. Let d_i denote the Euclidean distance between the points p_i and p_{i+1}, and let the line or curve segment c_i(t) interpolate the endpoints p_i and p_{i+1}. If d_i > d, we add additional positions to the trajectory: we evaluate the segment at parameter steps of d/d_i and insert the resulting points into the path. This, however, yields trajectory point sets of different sizes; hence, point pair registration cannot be applied between them in its previous form.
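A sketch of this resampling rule (names ours); segment_eval(i, t) is whichever interpolant of Sec. 2.1 was chosen, e.g. Eq. (3) for LI or Eq. (10) for CRS and KBS:

    import numpy as np

    def densify(points, d, segment_eval):
        # Insert synthetic positions on every segment whose endpoint
        # distance d_i exceeds the sampling interval d. segment_eval(i, t)
        # returns the interpolated position on segment i at t in [0, 1].
        out = []
        for i in range(len(points) - 1):
            out.append(points[i])
            d_i = np.linalg.norm(points[i + 1] - points[i])
            if d_i > d:
                step = d / d_i                   # parameter step of d / d_i
                for t in np.arange(step, 1.0, step):
                    out.append(segment_eval(i, t))
        out.append(points[-1])
        return np.asarray(out)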

Figure 6: Estimated 3D sphere center trajectories before registration. The original calibration method provides subcentimeter-precise 3D positions.

2.2. Trajectory registration

After the spline fitting, the point pairs lose their importance due to the quasi-continuous point sets and the varying sampling between the control points. Classical point cloud registration methods can be applied to such dense 3D paths. One of them is the iterative closest point (ICP) algorithm [5, 6]; the revised version made the closest-point computation efficient using a k-d tree [6]. The other one is the coherent point drift (CPD) algorithm [7], of which we tested the rigid version.
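As an illustration of this step, a compact rigid ICP in the spirit of [5, 6]: k-d tree nearest neighbours followed by the Arun update of Alg. 1 in every iteration (arun_ppr is our sketch from above; a rigid CPD [7] would replace the hard nearest-neighbour matches with soft GMM correspondences):

    import numpy as np
    from scipy.spatial import cKDTree

    def icp_rigid(target, source, iters=50, tol=1e-8):
        # Align `source` (camera trajectory) to `target` (LiDAR trajectory).
        # Each iteration pairs every transformed source point with its nearest
        # target point via a k-d tree [6], then re-solves R, t with arun_ppr.
        R, t = np.eye(3), np.zeros(3)
        tree = cKDTree(target)
        prev = np.inf
        for _ in range(iters):
            moved = source @ R.T + t
            dist, idx = tree.query(moved)        # closest-point matches
            R, t = arun_ppr(target[idx], source) # best rigid fit to the matches
            rmse = np.sqrt(np.mean(dist ** 2))
            if abs(prev - rmse) < tol:           # stop once the RMSE stagnates
                break
            prev = rmse
        return R, t, rmse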
3. Tests and results

We examined whether the registration results could be made more precise using the new approach, and which method is the most suitable among the alternative versions. Furthermore, we tested the parameter setting for the sampling interval. The tests were implemented in MATLAB. To compare the results, we computed the root mean square error (ℛℳ𝒮ℰ) of the Euclidean distance between the aligned point clouds. The sensors of the vehicle (see Fig. 3 and the synchronization in Sec. 1.1) captured camera-LiDAR video sequences to test real-world scenarios.

The input of the algorithms is the 3D point sequences of sphere centers. The test input used for illustration is visualized in Fig. 6; it contains 30 positions per sensor, which corresponds to a 7.5 s video sequence recorded in an outdoor environment in a parking lot. We processed this data feed as described in [2]. The sphere centers were estimated from the images by ellipse detection and then projected into 3D using the known radius of the sphere; the LiDAR-based sphere localization used a fix-point iteration method [12].
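The error metric itself takes only a few lines; since the interpolated clouds differ in size, a nearest-neighbour pairing is assumed here (a detail the text leaves open; sketch and names are ours):

    import numpy as np
    from scipy.spatial import cKDTree

    def alignment_rmse(p_lidar, p_cam_aligned):
        # RMSE of Euclidean distances between the aligned clouds, pairing
        # every transformed camera point with its nearest LiDAR point.
        dist, _ = cKDTree(p_lidar).query(p_cam_aligned)
        return float(np.sqrt(np.mean(dist ** 2)))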
                                                                   ods discussed in Sec. 2.2 without estimating the move-






[Figure 7 grid: trajectory estimation results for sampling intervals 𝑑 = 0.1 𝑐𝑚, 0.5 𝑐𝑚, and 1 𝑐𝑚.]
Figure 7: Estimated 3D sphere center trajectories after linear interpolation (LI), Catmull–Rom spline (CRS) fitting, and Kochanek–Bartels spline (KBS) fitting with varying sampling interval 𝑑. The error rates are listed in Tab. 1.



Interpolation method | ℛℳ𝒮ℰ (cm) w.r.t. sampling interval 𝑑 (cm)
                     |  0.1  | 0.25 |  0.5  | 0.75 | 1.00
LI                   | 0.53  | 0.53 | 0.55  | 0.58 | 0.61
CRS [9]              | 0.63  | 0.63 | 0.64  | 0.67 | 0.70
KBS [11]             | 0.65  | 0.65 | 0.67  | 0.69 | 0.72

Table 1
ℛℳ𝒮ℰ results (in centimeters) for varying sampling intervals: all three trajectory interpolation methods give subcentimeter errors. With a sampling interval of 𝑑 ≤ 0.25 cm, both the error rate and the trajectory point density are acceptable.



At first, we compared the trajectory registration methods discussed in Sec. 2.2 without estimating the movement and without any interpolated points. In Fig. 5, the results for the test case of Fig. 6 are plotted. The ℛℳ𝒮ℰ of the ICP algorithm is 0.0155 m with a 0.0114 m standard deviation, while the PPR and CPD algorithms gave the same result, 0.0110 m with a 0.0083 m standard deviation. Based on these experiments and the fact that applying PPR to point clouds of different sizes is not feasible, we recommend using CPD.

We then analyzed the interpolation methods and whether the d_i-based sampling gives the sought ideal density. We examined several setups on the same example in Fig. 7 and Fig. 1. In conclusion, one position per 0.25 cm is a sufficient sampling parameter. In this particular case, linear interpolation was suitable for the problem due to the stretched curvatures of the movement; nonetheless, other scenarios require more adaptive approaches like CRS and KBS. Furthermore, the compared calibration results in Fig. 8 (right) suggest that the translation of LI is more accurate, while the rotation is similar to that obtained with CRS and KBS.

Figure 8: Test results of the trajectory interpolations, applied to show the estimated rigid transformation in a schematic figure. The viewing frustum of the camera is shown in the LiDAR reference frame for the different interpolation methods with 𝑑 = 0.25 𝑐𝑚 sampling. The real-world camera is the bottom-left one with perspective lenses in Fig. 3.

Based on the test scenarios, the proposed settings of Alg. 2 are the following. Tuning the sampling interval d to 0.25 cm gives sufficiently accurate results; denser sampling has no significant additional effect. Due to the linear-like input scenario, the linear approach performs best in the exhibited use case, but the results are similarly acceptable with CRS and KBS. Finally, trajectory registration using CPD is the most effective choice.






Applying these settings, the ℛℳ𝒮ℰ of the analyzed case was reduced from 0.0110 m (PPR, no interpolated trajectory) to 0.0053 m (CPD, 𝑑 = 0.25 𝑐𝑚, LI).

Synthetic tests provide the opportunity for a direct comparison of the estimated calibration parameters to ground truth data. Based on early-stage results, the main open question is the best interpolation method for trajectory estimation, which largely depends on the specific scenario. Paths with greater curvature give a better result for the rigid transformation with CRS and KBS. During the further refinement of the paper, we will analyze in more depth how the synthesized movement affects the precision of the proposed methods.

4. Conclusions

This paper argued how trajectory registration can improve extrinsic sensor calibration algorithms when 3D positions can be extracted from the input data, e.g., based on a specific target object. The novelty of the method is the controlled path of the object and the dense point set generated by spline fitting and interpolation, followed by point cloud registration. The algorithm guarantees sample continuity and speeds up the measurement, as one video sequence is sufficient. The method handles the noise in the input images or the LiDAR scans independently owing to the point cloud-based approach, which makes it more robust to outliers.

There are some open questions regarding the object path and the setup of the measurements. We would like to validate the improved accuracy of the calibration via synthetic tests and analyze whether an ideal trajectory exists that makes the results more accurate. Moreover, we will examine the effect of noise in the 3D positions. Besides the spherical setup, a chessboard-based approach could also be analyzed, where an additional problem is the interpolation of the rotation, as not only one position but four corner points can be detected in every frame. In the future, another application of the project can be the trajectory estimation of autonomous vehicles using multi-sensor systems.

References

[1] R. Hartley, A. Zisserman, Multiple View Geometry in Computer Vision (2nd ed.), Cambridge University Press, 2004. doi:10.1017/CBO9780511811685.
[2] T. Tóth, Z. Pusztai, L. Hajder, Automatic LiDAR-camera calibration of extrinsic parameters using a spherical target, in: 2020 IEEE International Conference on Robotics and Automation (ICRA 2020), Paris, France, May 31 - August 31, 2020, IEEE, 2020, pp. 8580–8586. doi:10.1109/ICRA40945.2020.9197316.
[3] L. Zhou, Z. Li, M. Kaess, Automatic extrinsic calibration of a camera and a 3D LiDAR using line and plane correspondences, in: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, pp. 5562–5569.
[4] K. S. Arun, T. S. Huang, S. D. Blostein, Least-squares fitting of two 3-D point sets, IEEE Transactions on Pattern Analysis and Machine Intelligence PAMI-9 (1987) 698–700. doi:10.1109/TPAMI.1987.4767965.
[5] P. J. Besl, N. D. McKay, A method for registration of 3-D shapes, IEEE Trans. Pattern Anal. Mach. Intell. 14 (1992) 239–256. URL: http://dblp.uni-trier.de/db/journals/pami/pami14.html#BeslM92.
[6] Z. Zhang, Iterative point matching for registration of free-form curves and surfaces, Int. J. Comput. Vision 13 (1994) 119–152. URL: https://doi.org/10.1007/BF01427149. doi:10.1007/BF01427149.
[7] A. Myronenko, X. Song, Point set registration: Coherent point drift, IEEE Transactions on Pattern Analysis and Machine Intelligence 32 (2010) 2262–2275. doi:10.1109/TPAMI.2010.46.
[8] R. Knothe, S. Romdhani, T. Vetter, Combining PCA and LFA for surface reconstruction from a sparse set of control points, in: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition, FGR '06, IEEE Computer Society, USA, 2006, pp. 637–644. URL: https://doi.org/10.1109/FGR.2006.31. doi:10.1109/FGR.2006.31.
[9] E. E. Catmull, R. Rom, A class of local interpolating splines, Computer Aided Geometric Design (1974) 317–326.
[10] C. Yuksel, S. Schaefer, J. Keyser, On the parameterization of Catmull-Rom curves, in: 2009 SIAM/ACM Joint Conference on Geometric and Physical Modeling, SPM '09, Association for Computing Machinery, New York, NY, USA, 2009, pp. 47–53. URL: https://doi.org/10.1145/1629255.1629262. doi:10.1145/1629255.1629262.
[11] D. H. U. Kochanek, R. H. Bartels, Interpolating splines with local tension, continuity, and bias control, SIGGRAPH Comput. Graph. 18 (1984) 33–41. URL: https://doi.org/10.1145/964965.808575. doi:10.1145/964965.808575.
[12] T. Tóth, L. Hajder, Robust fitting of geometric primitives on LiDAR data, in: Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 5: VISAPP, INSTICC, SciTePress, 2019, pp. 622–629. doi:10.5220/0007572606220629.



