=Paper=
{{Paper
|id=None
|storemode=property
|title=High Performance GPU-Based Preprocessing for Time-of-Flight Imaging in Medical Applications
|pdfUrl=https://ceur-ws.org/Vol-715/bvm2011_67.pdf
|volume=Vol-715
}}
==High Performance GPU-Based Preprocessing for Time-of-Flight Imaging in Medical Applications==
Jakob Wasza¹, Sebastian Bauer¹, Joachim Hornegger¹,²
¹ Pattern Recognition Lab, Friedrich-Alexander University Erlangen-Nuremberg
² Erlangen Graduate School in Advanced Optical Technologies (SAOT)
jakob.wasza@informatik.uni-erlangen.de
Abstract. Time-of-Flight (ToF) imaging is a promising technology for
real-time metric surface acquisition and has recently been proposed for a
variety of medical applications. However, due to limitations of the sensor,
range data from ToF cameras are subject to noise and contain invalid
outliers. In this paper, we discuss a real-time capable framework for ToF
preprocessing in a medical environment. The contribution of this work is
threefold. First, we address the restoration of invalid measurements that
typically occur with specular reflections on wet organ surfaces. Second,
we compare the conventional bilateral filter with the recently introduced
concept of guided image filtering for edge preserving de-noising. Third,
we have implemented the pipeline on the graphics processing unit (GPU),
enabling high-quality preprocessing in real-time. In experiments, the
framework achieved a depth accuracy of 0.8 mm (1.4 mm) on synthetic
(real) data, at a total runtime of 40 ms.
1 Introduction
Recent advances in Time-of-Flight (ToF) imaging have opened new perspectives
for its use in medical engineering. In particular, the resolution (40k points),
frame rate (40 Hz) and validity information provided by the camera hold po-
tential for medical applications. ToF imaging has, among others, been proposed
for 3D endoscopy [1] and intra-operative organ surface registration [2]. The in-
herent high degree of accuracy for these tasks requires a preprocessing pipeline
to cope with the noisy and corrupted data obtained from the ToF sensor. Even
though a proper sensor calibration [3] can be used to eliminate systematic er-
rors, de-noising provides the fundamental basis to produce steady and reliable
surface data. Here, temporal averaging and edge preserving filters are commonly
used [2, 3]. One issue that cannot be addressed by conventional filters is
the elimination of invalid depth values caused by specular reflections that lead
to saturated sensor elements. These effects often occur with ToF surface acqui-
sition of organs due to their shiny and wet surface. In this paper, we propose
to use a spectral domain method known from digital radiography to estimate
the depth information at invalid pixels. As low filter run times are a crucial
factor, we investigate the performance and robustness of the bilateral filter and
the recently introduced guided image filter for edge preserving de-noising. To ul-
timately achieve real-time capability, we implemented all filters on the graphics
processing unit (GPU).
2 Materials and Methods
2.1 Preprocessing pipeline
The preprocessing pipeline in this work consists of three modules that operate on
the distance information: (i) defect pixel interpolation, (ii) temporal averaging,
(iii) edge preserving de-noising.
Defect Pixel Interpolation. In order to correct invalid depth measurements,
we adopt a spectral domain method that was proposed by Aach and Metzler
for defect pixel interpolation in digital radiography [4]. The basic assumption
is that the observed corrupted signal g can be expressed as a multiplication of
the ideal signal f with a binary defect mask w given by the validity information
provided by the camera. This corresponds to a convolution (∗) in the frequency
domain
g = f \cdot w \;\Longleftrightarrow\; G = F * W \qquad (1)
where F ,G and W denote the spectra of f , g and w, respectively. The unknown
complex coefficients of F are then estimated by an iterative spectral deconvolu-
tion scheme and the restored ideal signal is obtained as
f(x) = g(x) + (1 - w(x)) \cdot \hat{f}(x) \qquad (2)

where f̂ denotes the inverse Fourier transform of the estimated spectrum F.
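The restoration idea behind equations (1) and (2) can be illustrated with a simplified 1D stand-in: a Papoulis-Gerchberg-style iteration that alternates between band-limiting the current estimate in the frequency domain and re-inserting the known valid samples. This is not the exact deconvolution scheme of Aach and Metzler [4]; the signal, defect mask, and bandwidth below are illustrative assumptions.

```python
# Band-limited extrapolation sketch of the masked-restoration idea in Eqs. (1)-(2).
# NOTE: simplified 1D stand-in, not the exact Aach-Metzler scheme of [4].
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def restore(g, w, band=2, iters=100):
    """Estimate the ideal signal f from the corrupted signal g = f * w.

    w[n] = 1 marks valid samples, w[n] = 0 defect ones. Each iteration
    band-limits the current estimate in the frequency domain and then
    re-inserts the known valid samples, cf. Eq. (2)."""
    N = len(g)
    est = list(g)
    for _ in range(iters):
        F = dft(est)
        # keep only low-frequency coefficients |k| <= band
        F = [F[k] if (k <= band or k >= N - band) else 0.0 for k in range(N)]
        fhat = idft(F)
        # Eq. (2): valid pixels keep g, defect pixels take the estimate f-hat
        est = [g[n] if w[n] else fhat[n] for n in range(N)]
    return est

# usage: a smooth surface profile with two defect (saturated) samples
N = 32
f_true = [math.cos(2 * math.pi * n / N) for n in range(N)]
w = [0 if n in (5, 6) else 1 for n in range(N)]
g = [f_true[n] * w[n] for n in range(N)]
f_rec = restore(g, w)
```

For a band-limited signal with few missing samples this iteration converges quickly; the defect positions are filled with values consistent with the valid neighborhood.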
Temporal Averaging. For a frame at time t we perform temporal de-noising
by computing the arithmetic mean of N successive frames g_i

f_t(x) = \frac{1}{N} \sum_{i=t-N+1}^{t} g_i(x) \qquad (3)
We note that this filter can be implemented in a recursive manner, i.e.

f_t(x) = \frac{1}{N} \left( N \cdot f_{t-1}(x) - g_{t-N}(x) + g_t(x) \right) \qquad (4)
This formulation provides an effective way to reduce GPU memory usage as not
all N frames have to be stored and accessed. As the evaluation of equation
(4) per pixel x can be executed in parallel, an implementation on the GPU is
straightforward.
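A minimal CPU sketch of the recursive update in equation (4), assuming frames arrive as flat per-pixel lists of distance values (the class name, warm-up handling, and ring-buffer choice are illustrative, not prescribed by the pipeline):

```python
# Recursive temporal average, Eq. (4): f_t = (N*f_{t-1} - g_{t-N} + g_t) / N.
# Frames are flat lists of per-pixel distances; a deque serves as ring buffer.
from collections import deque

class TemporalAverage:
    def __init__(self, n_frames):
        self.n = n_frames
        self.buffer = deque()   # ring buffer of the last N frames
        self.mean = None        # current per-pixel mean f_t

    def update(self, frame):
        if self.mean is None:
            self.mean = [0.0] * len(frame)
        self.buffer.append(frame)
        if len(self.buffer) > self.n:
            oldest = self.buffer.popleft()
            # steady state: apply Eq. (4) per pixel
            self.mean = [(self.n * m - old + new) / self.n
                         for m, old, new in zip(self.mean, oldest, frame)]
        else:
            # warm-up: incremental running mean until N frames are available
            k = len(self.buffer)
            self.mean = [m + (new - m) / k for m, new in zip(self.mean, frame)]
        return self.mean
```

Only one addition, one subtraction, and one division per pixel are needed per frame, which is what makes the per-pixel parallelization on the GPU straightforward.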
Edge Preserving De-Noising. The bilateral filter proposed by Tomasi and
Manduchi [5] is a very popular spatial de-noising filter in ToF imaging. The
discrete version for the Gaussian case can be expressed as
f(x) = \frac{1}{K_x} \sum_{y \in \omega_x} g(y) \exp\left(-\frac{\|x - y\|^2}{\sigma_s^2}\right) \exp\left(-\frac{|g(x) - g(y)|}{\sigma_r^2}\right) \qquad (5)
where g denotes the input image and ωx denotes a local window centered at
coordinate x. Kx is a normalization factor, σs and σr control the spatial and
range similarity, respectively. Due to its translational-variant kernel this filter
is computationally expensive. Nevertheless, it can be implemented efficiently
on the GPU as the evaluation of equation (5) for all pixels x can be performed
concurrently. In order to cope with boundary conditions and potentially non-
coalesced GPU memory access patterns the input image g is bound as a texture.
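The per-pixel work that each GPU thread performs can be sketched on the CPU as follows. The kernels follow equation (5) as written; window radius and sigma values are illustrative, and the texture-based boundary handling is replaced by simple clamping at the image border.

```python
# Direct per-pixel evaluation of the bilateral filter, Eq. (5), on a small
# grayscale image given as a list of rows. Parameters are illustrative.
import math

def bilateral(img, radius=2, sigma_s=2.0, sigma_r=3.0):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc, norm = 0.0, 0.0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    y, x = i + di, j + dj
                    if 0 <= y < h and 0 <= x < w:  # skip out-of-bounds pixels
                        # spatial and range kernels exactly as in Eq. (5)
                        ws = math.exp(-(di * di + dj * dj) / sigma_s ** 2)
                        wr = math.exp(-abs(img[i][j] - img[y][x]) / sigma_r ** 2)
                        acc += img[y][x] * ws * wr
                        norm += ws * wr
            out[i][j] = acc / norm  # K_x normalization
    return out
```

On a noise-free step edge the range kernel suppresses contributions from across the edge, so the discontinuity survives the filtering almost unchanged.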
Recently, the guided image filter was proposed by He et al. [6]. This filter has
a non-approximative linear-time algorithm for edge preserving filtering, thus,
being very promising for real-time ToF preprocessing. The filter output for a
pixel x can eventually be deduced as
f(x) = \left( \frac{1}{|\omega_x|} \sum_{y \in \omega_x} a_y \right) I(x) + \frac{1}{|\omega_x|} \sum_{y \in \omega_x} b_y \qquad (6)
where ωx denotes a local window centered at x and I denotes the guidance
image. The evaluation of the summations in equation (6) and the estimation
of the coefficients ay and by [6] can be done by using box filters. In turn, box
filtering can be performed efficiently by using integral images which provides the
basis for a linear-time algorithm. We employ the parallel-prefix-sum algorithm
as described in [7] for the computation of integral images on the GPU.
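The integral-image trick behind the linear-time algorithm can be sketched as follows: once the summed-area table is built, any box sum costs four table lookups, independent of the window size. The sequential construction below computes what [7] obtains with parallel prefix sums on the GPU.

```python
# Box filtering via an integral image (summed-area table). Building the table
# is one pass over the image; each box sum afterwards is four lookups.
def integral_image(img):
    h, w = len(img), len(img[0])
    sat = [[0.0] * (w + 1) for _ in range(h + 1)]  # extra zero row and column
    for i in range(h):
        for j in range(w):
            sat[i + 1][j + 1] = (img[i][j] + sat[i][j + 1]
                                 + sat[i + 1][j] - sat[i][j])
    return sat

def box_sum(sat, top, left, bottom, right):
    """Sum of img[top..bottom][left..right] from four table lookups."""
    return (sat[bottom + 1][right + 1] - sat[top][right + 1]
            - sat[bottom + 1][left] + sat[top][left])
```

Dividing `box_sum` by the window area yields the box-filter means needed for the coefficients and the two averages in equation (6), all at constant cost per pixel.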
2.2 Experiments
In order to assess the accuracy of the presented methods we evaluate the absolute
distance error between a ground truth and a template object on a per-pixel basis.
Synthetic Data. Deciding on a ground truth for the evaluation is a non-trivial
task as the ideal metric surface of the observed object is in general not known.
Therefore, we conduct experiments on simulated distance values reconstructed
from the z-buffer representation of a 3D scene. These values constitute the un-
biased ground truth in this experiment. We then approximate the temporal
noise on a per-pixel basis by adding an individual offset drawn from a normal
distribution with σ = 10 mm and µ = 0 mm. This standard deviation is mo-
tivated by observations on real ToF data. In order to simulate the effect of
amplitude related noise variance and total reflections we additionally corrupt
the distance data by Perlin noise [8]. For this study, we rendered a porcine liver
mesh segmented from a CT scan.
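The Gaussian part of this noise model can be sketched as follows; the Perlin-noise corruption [8] is omitted for brevity, and the function name and seed are illustrative.

```python
# Per-pixel temporal noise model for the synthetic experiments: each ground-truth
# depth value receives an offset drawn from N(mu = 0 mm, sigma = 10 mm).
import random

def add_depth_noise(depth_mm, sigma_mm=10.0, mu_mm=0.0, seed=42):
    rng = random.Random(seed)  # seeded for reproducible experiments
    return [d + rng.gauss(mu_mm, sigma_mm) for d in depth_mm]
```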
Table 1. Error analysis on synthetic data and filter run times.

Filter           None       DPI        TA         BF         GF
Mean error [mm]  6.1 ± 8.2  5.5 ± 5.1  1.8 ± 1.5  0.4 ± 0.6  0.8 ± 2.1
Runtime [ms]     n/a        34 ± 6.3   0.7 ± 0.1  2.9 ± 0.1  2.6 ± 0.2
Real Data. The focus of the real data experiments is the comparison of a
common preprocessing pipeline (temporal averaging, edge preserving de-noising)
with a pipeline that additionally performs defect pixel interpolation. We con-
ducted the experiment using a PMD CamCube 3.0 with a resolution of 200×200
pixels. For the ground truth generation, we averaged 1000 frames acquired from
an uncorrupted liver phantom and applied a bilateral filter with σs = 5 pixels
and σr = 50 mm to smooth out systematic artifacts that could not be corrected by
sensor calibration. We then placed two pieces of aluminum foil onto the phantom
to simulate a reflective surface. Using this corrupted phantom, we assessed the
results of the standard preprocessing pipeline as applied for the ground truth
and the same pipeline performing a defect pixel interpolation as a first step.
3 Results
We have implemented the defect pixel interpolation (DPI), temporal averag-
ing (TA), bilateral filter (BF) and guided image filter (GF) on a Quadro FX
2800M GPU using NVidia’s CUDA technology. Qualitative results for the de-
fect pixel interpolation on real data are depicted in Fig. 1. Without correction
the corrupted regions show a mean error of 68.4 ± 40.2 mm. Using defect pixel
interpolation we were able to reduce this error to 1.4 ± 1.1 mm. Qualitative and
quantitative results for the filter evaluation on synthetic data are shown in Fig. 2
and Table 1; note that the error reduction is cumulative along the pipeline,
whereas the run times are given per filter. Using the presented preprocessing
pipeline we were able to reduce
the mean error across the surface from 6.1±8.2 mm to 0.4±0.6 mm and 0.8±2.1
mm for the bilateral and the guided image filter, respectively. The bilateral filter
shows a better accuracy at very strong edges (Fig. 2). Given a total pipeline
runtime of approximately 40 ms, real-time constraints are satisfied.
Fig. 1. Defect pixel interpolation on real data (panels: standard pipeline, error map, DPI, error map).
Fig. 2. Filter operations on synthetic data (rows: ToF data, error map, details; columns: raw data, DPI, TA, BF, GF).
4 Discussion
We have presented a real-time capable preprocessing pipeline for ToF imaging
in medical applications. The defect pixel interpolation yielded promising re-
sults with regard to accuracy. Nonetheless, future research has to investigate
alternative spectral domain methods as well as spatial domain methods for the
restoration of invalid depth values. For edge preserving de-noising, guided image
filtering proved a viable alternative to the bilateral filter. However, the latter
shows a better behavior at sharp edges. Future work has to investigate adaptive
variants of edge preserving filters that additionally account for the amplitude re-
lated noise variance. Concerning runtime issues, we note that even on a current
mid-range GPU real-time constraints can be satisfied. This is a promising result
with regard to next-generation and potentially high-resolution ToF sensors.
References
1. Penne J, Höller K, Stürmer M, et al. Time-of-flight 3D endoscopy. Lect Notes
Computer Sci. 2009;5761:467–74.
2. Seitel A, Santos T, Mersmann S, et al. Time-of-Flight Kameras für die intraoperative
Oberflächenerfassung. Proc BVM. 2010; p. 11–5.
3. Lindner M, Schiller I, Kolb A, et al. Time-of-Flight sensor calibration for accurate
range sensing. Comput Vis Image Underst. 2010; p. in press.
4. Aach T, Metzler V. Defect interpolation in digital radiography - how object-oriented
transform coding helps. Proc SPIE. 2001;4322:824–35.
5. Tomasi C, Manduchi R. Bilateral filtering for gray and color images. Proc ICCV.
1998; p. 839–46.
6. He K, Sun J, Tang X. Guided image filtering. Lect Notes Computer Sci. 2010;6311:1–
14.
7. Harris M, Sengupta S, Owens JD. Parallel prefix sum (scan) with CUDA. In:
Nguyen H, editor. GPU Gems 3. Addison Wesley; 2007. p. 1–18.
8. Perlin K. An image synthesizer. SIGGRAPH Comput Graph. 1985;19(3):287–296.