<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>High-accuracy ultrasound target localization for hand-eye calibration between optical tracking systems and three-dimensional ultrasound</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Ralf Bruder</string-name>
          <email>bruder@rob.uni-luebeck.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Florian Griese</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Floris Ernst</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Achim Schweikard</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Institute for Robotics and Cognitive Systems, University Lübeck</institution>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University Lübeck</institution>
        </aff>
      </contrib-group>
      <fpage>179</fpage>
      <lpage>183</lpage>
      <abstract>
        <p>Real-time target localization in ultrasound is useful in many clinical and scientific areas. For example, in radiation therapy, tumors can be localized in real time and irradiated with high accuracy. To measure the position of an ultrasound target in a global coordinate system, or to extend the tracking volume by moving the ultrasound transducer, an optical marker is attached to the transducer and observed by an optical tracking system. The necessary calibration matrices from marker to ultrasound volume are obtained using hand-eye calibration algorithms, which take sets of corresponding observations of the optical marker and an ultrasound target as input. The quality of these calibration matrices depends strongly on the measured observations. While the accuracy of optical tracking systems is very high, accurate tracking in ultrasound is difficult because of the low resolution of the ultrasound volume, artifacts and noise. This makes accurate hand-eye calibration between ultrasound and optical tracking systems difficult. We have tested different phantoms, matching strategies and sub-pixel strategies to provide highly accurate tracking results in 3D ultrasound volumes as a basis for hand-eye calibration. Tests have shown that, using the described methods, calibration results with RMS errors of less than 1 mm between observed and calibrated targets can be reached.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        With the recent enhancements in imaging quality and speed, three-dimensional
ultrasound has become attractive for automatic guidance during robotized
interventions and for high-accuracy target tracking applications. We use a modified
GE Vivid 7 Dimension 3D cardiovascular ultrasound station for real-time volume
processing and direct target localization in radiosurgery [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. This station is
capable of providing ultrasound volume scans of the target region with more than
20 frames per second. A framework was established to run image-processing
algorithms directly on the ultrasound machine in order to track targets inside
the ultrasound volume.
      </p>
      <p>
        To map the obtained target positions to world coordinates, the ultrasound
transducer is itself tracked using an attached optical marker and an accuTrack
250 (atracsys LLC, Bottens, CH) optical tracking system to locate the marker.
The position of the ultrasound target in world coordinates can then be calculated
from both tracking results and the static transformation between the ultrasound
transducer and the optical marker. Among other strategies, the best results for this
transformation are usually obtained [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] using a hand-eye calibration algorithm [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] and the
calibration setup shown in Figure 1. The equation <italic>T<sub>i</sub> B U<sub>i</sub> = C</italic> is solved for
the unknown calibration matrix <italic>B</italic> by capturing multiple sets of corresponding
optical and ultrasound positions <italic>T<sub>i</sub></italic> and <italic>U<sub>i</sub></italic> for different transducer positions.
      </p>
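      <p>Writing each optical pose <italic>T<sub>i</sub></italic> in terms of its rotation <italic>R<sub>i</sub></italic> and translation <italic>t<sub>i</sub></italic>, the equation becomes linear in the entries of <italic>B</italic> and of the fixed target point <italic>C</italic>, so a direct least-squares solution is possible. The following sketch (our NumPy illustration, not the implementation from the paper, which uses a hand-eye algorithm from the literature) solves this linearized system:</p>

```python
import numpy as np

def solve_hand_eye(Ts, us):
    """Solve T_i @ B @ u_i = c for the calibration matrix B and the fixed
    world point c, given optical marker poses T_i (4x4 rigid transforms)
    and homogeneous ultrasound target positions u_i (length 4).
    Linearization: R_i @ B34 @ u_i + t_i = c, i.e.
    (u_i^T kron R_i) vec(B34) - c = -t_i  (column-major vec)."""
    rows, rhs = [], []
    for T, u in zip(Ts, us):
        R, t = T[:3, :3], T[:3, 3]
        rows.append(np.hstack([np.kron(u.reshape(1, 4), R), -np.eye(3)]))
        rhs.append(-t)
    x, *_ = np.linalg.lstsq(np.vstack(rows), np.hstack(rhs), rcond=None)
    B = np.vstack([x[:12].reshape(3, 4, order="F"), [0.0, 0.0, 0.0, 1.0]])
    return B, x[12:]
```

      <p>Each pose contributes three equations for the 15 unknowns, so at least five well-spread transducer poses are needed; with noisy measurements the rotation part of <italic>B</italic> should additionally be projected onto SO(3).</p>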
      <p>One problem of hand-eye calibration is that the quality of the measured
position sets strongly influences the calibration result. While the measurements of
optical tracking systems are highly accurate, tracking in ultrasound is difficult
because of the low resolution of the volume data, artifacts and noise. To obtain
an accurate calibration, this tracking has to be optimized.</p>
    </sec>
    <sec id="sec-2">
      <title>Materials and Methods</title>
      <p>A calibration setup is used (Fig. 1). An application has been developed to
capture both ultrasound and optical tracking results. To obtain corresponding
datasets from both systems, the capture time of each measurement is
calculated using high-accuracy timestamps and an estimate of the system latencies.
The optical tracking result is then chosen to match the capture time of the
corresponding ultrasound volume. For intuitive handling during hand-held sample
acquisition, an automatic capture trigger algorithm has been implemented
in addition to the manually triggered capture process. This algorithm triggers
during phases of minimal movement in both tracking results, further minimizing
motion artifacts in the measured data. After a
predefined sample count is reached, the result of the hand-eye calibration is calculated and
the distance error between the calibrated and the measured points is computed.
Depending on these values, an iterative post-processing algorithm identifies up to
ten percent of the samples with high distance errors, eliminates these samples and
recalculates the calibration. In this way errors such as false target detections
in ultrasound are eliminated.</p>
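      <p>The sample-elimination loop can be sketched generically; the calibration routine is abstracted as a callable, and the ten-percent budget follows the description above (an illustrative sketch, not the authors' code):</p>

```python
import numpy as np

def prune_and_recalibrate(samples, calibrate, max_drop=0.10):
    """Iteratively remove the worst-fitting samples (at most max_drop of
    the original count) and recalibrate. `calibrate` maps a sample list
    to (model, per-sample distance errors)."""
    budget = int(max_drop * len(samples))
    model, errors = calibrate(samples)
    for _ in range(budget):
        worst = int(np.argmax(errors))       # sample with highest distance
        samples = samples[:worst] + samples[worst + 1:]
        model, errors = calibrate(samples)   # recalibrate without it
    return model, samples
```

      <p>Removing one sample at a time and recalibrating is slightly more conservative than dropping all ten percent at once, since each removal changes which remaining samples fit worst.</p>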
      <p>Two types of phantoms are used for tracking in ultrasound: a single
target (Figure 1a) is used to maximize tracking accuracy, while later on a complex
phantom (Fig. 1b) with multiple features is used to increase targeting accuracy
and reduce the number of transducer positions that must be measured.</p>
      <sec id="sec-2-1">
        <title>Tracking a single target</title>
        <p>A lead bulb on a nylon wire in a liquid tank is used as target. The lead bulb
strongly reflects ultrasound and can easily be detected in the ultrasound
volume using a maximum intensity search. As our ultrasound volume has a
worst-case resolution of more than 1.5 mm per pixel, this method alone is not sufficient
for high-accuracy tracking. To overcome this limitation, we use cubic splines to
interpolate the target region around a detected position with maximum intensity.
The extreme value of the interpolated volume is used as a sub-pixel approximation
of the lead bulb position. Another tracking possibility is template matching:
using a predefined pattern of the lead bulb, the sum of squared differences (SSD)
is calculated to find an optimal match between the pattern and the lead bulb.
With sub-pixel enhancement through iterative interpolation of the target region,
a high-accuracy tracking result can be obtained.</p>
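        <p>The maximum-intensity search with cubic-spline refinement might look as follows; the one-voxel search radius and the 0.05-voxel grid step are illustrative choices, not values from the paper:</p>

```python
import numpy as np
from scipy.ndimage import map_coordinates

def subvoxel_peak(volume, radius=1.0, step=0.05):
    """Integer argmax of a 3D volume, refined by evaluating a cubic
    spline (order=3) on a fine grid around the detected voxel."""
    peak = np.array(np.unravel_index(np.argmax(volume), volume.shape), float)
    offsets = np.arange(-radius, radius + step / 2, step)
    grid = np.stack(np.meshgrid(*[peak[d] + offsets for d in range(3)],
                                indexing="ij"))
    vals = map_coordinates(volume, grid.reshape(3, -1), order=3, mode="nearest")
    best = np.unravel_index(np.argmax(vals), (offsets.size,) * 3)
    return peak + np.array([offsets[i] for i in best])
```

        <p>For the template-matching variant, the same refinement applies to the (negated) SSD surface instead of the raw intensities.</p>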
        <p>Figure 3b shows the different tracking positions for a two-dimensional slice
of the target. To test the sub-pixel quality of each strategy a lead bulb is moved
through a liquid tank on a linear trajectory while being continuously tracked
(Figure 3c). While the tracking accuracy is increased with both strategies,
template matching shows the best linearity and highest accuracy.
</p>
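        <p>One way to quantify the linearity observed in such a trajectory test (our metric for illustration; the paper does not specify one) is to fit a 3D line to the tracked positions by PCA and report the RMS perpendicular residual:</p>

```python
import numpy as np

def linearity_rms(points):
    """Fit a line through N x 3 tracked positions via PCA and return the
    RMS perpendicular distance of the points from that line."""
    pts = np.asarray(points, float)
    centered = pts - pts.mean(axis=0)
    # First right-singular vector = direction of the best-fit line.
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    resid = centered - np.outer(centered @ Vt[0], Vt[0])
    return float(np.sqrt((resid ** 2).sum(axis=1).mean()))
```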
      </sec>
      <sec id="sec-2-2">
        <title>Tracking multiple targets</title>
        <p>Three-dimensional ultrasound offers the possibility to track multiple targets
inside one volume simultaneously. We use a complex wire phantom (Dansk Fantom
Service, Fig. 4a) to find a defined geometry of twisted nylon wires inside the
volume. The wires are clearly visible in the ultrasound volume and can easily be
tracked using a repeated maximum intensity search with an additional distance
criterion, so that features in close proximity to already located points are not
detected again. Alternatively, template matching can be used with multiple predefined
patterns for wires at different angles to the beam.</p>
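        <p>The repeated maximum-intensity search with a distance criterion amounts to a greedy non-maximum suppression; the exclusion radius and feature count below are illustrative:</p>

```python
import numpy as np

def find_features(volume, n_features, min_dist=5.0):
    """Repeatedly take the global intensity maximum and suppress a sphere
    of radius min_dist around it, so that voxels close to an already
    located feature are not detected again."""
    vol = volume.astype(float).copy()
    grids = np.indices(vol.shape)
    peaks = []
    for _ in range(n_features):
        p = np.unravel_index(np.argmax(vol), vol.shape)
        peaks.append(tuple(int(i) for i in p))
        dist2 = sum((g - c) ** 2 for g, c in zip(grids, p))
        vol[dist2 <= min_dist ** 2] = -np.inf
    return peaks
```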
        <p>
          The obtained feature positions are matched against a predefined virtual
phantom or the first captured phantom using the ICP algorithm [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ] which may be
replaced by RANSAC [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ] to eliminate erroneous feature detections.
The resulting transformation matrix between virtual and detected feature cloud
is used as tracking result Ui. Figure 4 shows the alignment results.
Fig. 4. The complex wire phantom in ultrasound is shown in (a). In (b) the extracted
feature cloud (green) is registered to the predefined phantom (blue). While RANSAC
(black) works as expected, the ICP result (red) is rotated due to false feature detections.
(Legend entries: lead bulb, standard resolution; lead bulb, sub-pixel enhanced;
complex phantom, standard resolution; complex phantom, sub-pixel enhanced.)
        </p>
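        <p>At the core of both the ICP alignment and the RANSAC hypothesis evaluation is a rigid fit between corresponded point sets. A standard SVD-based (Kabsch) solution, shown here as a sketch rather than the implementation used in the paper:</p>

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rotation R and translation t with R @ src_i + t = dst_i
    for corresponded N x 3 point sets (Kabsch algorithm)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s
```

        <p>ICP alternates this fit with nearest-neighbor correspondence search; RANSAC instead evaluates fits from minimal point subsets, which is what makes it robust to the false feature detections mentioned above.</p>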
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Results</title>
      <p>To measure the calibration quality, we compute the RMS distance error between
all calibrated and measured target points. Table 1 shows this quality value for
the different calibration scenarios. For the single target, an automatic calibration
with 100 captured points is performed without manual interaction.
For the complex phantom, measurements at three different positions with 50 located
targets in each volume are performed. Both calibration methods show
acceptable results, with RMS errors of less than 2 mm at standard resolution. With
high-accuracy ultrasound target localization, both methods can be significantly
improved. For single targets, the RMS error can be reduced to less than 1 mm
using only half of the captured points. As ICP is very sensitive to position errors, the
accuracy gain for the complex phantom calibration due to better target localization
is even higher.</p>
    </sec>
    <sec id="sec-4">
      <title>Conclusion</title>
      <p>Two calibration methods are implemented and enhanced with sub-pixel target
localization. Especially with complex phantoms, a stable calibration can be
performed in only three steps, which makes the method attractive for common use.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Bruder</surname>
            <given-names>R</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ernst</surname>
            <given-names>F</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schlaefer</surname>
            <given-names>A</given-names>
          </string-name>
          , et al.
          <article-title>Real-time tracking of the pulmonary veins in 3D ultrasound of the beating heart</article-title>
          .
          <source>In: 51st Annual Meeting of the AAPM</source>
          . vol.
          <volume>36</volume>
          of Med Phys;
          <year>2009</year>
          . p.
          <fpage>2804</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Poon</surname>
            <given-names>TC</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rohling</surname>
            <given-names>RN</given-names>
          </string-name>
          .
          <article-title>Comparison of calibration methods for spatial tracking of a 3-D ultrasound probe</article-title>
          .
          <source>Eur J Ultrasound</source>
          .
          <year>2005</year>
          ;
          <volume>31</volume>
          (
          <issue>8</issue>
          ):
          <fpage>1095</fpage>
          -
          <lpage>108</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Tsai</surname>
            <given-names>RY</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lenz</surname>
            <given-names>RK</given-names>
          </string-name>
          .
          <article-title>A new technique for fully autonomous and efficient 3D robotics hand/eye calibration</article-title>
          .
          <source>IEEE Trans Rob Autom</source>
          .
          <year>1989</year>
          ;
          <volume>5</volume>
          (
          <issue>3</issue>
          ):
          <fpage>345</fpage>
          -
          <lpage>58</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Rusinkiewicz</surname>
            <given-names>S</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Levoy</surname>
            <given-names>M</given-names>
          </string-name>
          .
          <article-title>Efficient variants of the ICP algorithm</article-title>
          .
          <source>In: Proc Int Conf 3D Digit Imaging Model; 2001</source>
          . p.
          <fpage>145</fpage>
          -
          <lpage>52</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Chen</surname>
            <given-names>C</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hung</surname>
            <given-names>Y</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cheng</surname>
            <given-names>J</given-names>
          </string-name>
          .
          <article-title>RANSAC-based DARCES: a new approach to fast automatic registration of partially overlapping range images</article-title>
          .
          <source>IEEE Trans Pattern Anal Mach Intell</source>
          .
          <year>1999</year>
          ;
          <volume>21</volume>
          :
          <fpage>1229</fpage>
          -
          <lpage>34</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>