Article

Tip Estimation Method in Phantoms for Curved Needle Using 2D Transverse Ultrasound Images

1 School of Mechanical Engineering and Automation, Harbin Institute of Technology, Shenzhen 518055, China
2 Robotics, Perception and Artificial Intelligence Lab, The Chinese University of Hong Kong, N.T., Hong Kong, China
* Authors to whom correspondence should be addressed.
Appl. Sci. 2019, 9(24), 5305; https://doi.org/10.3390/app9245305
Submission received: 8 October 2019 / Revised: 21 November 2019 / Accepted: 27 November 2019 / Published: 5 December 2019
(This article belongs to the Special Issue Engineering for Surgery)

Abstract
Flexible needles are widely used in minimally invasive surgeries, especially in percutaneous interventions, where the tip position of the curved needle is critical because it directly affects the success of the procedure. In this paper, we present a method to estimate the tip position of a long curved needle using 2D transverse ultrasound images from a robotic ultrasound system. Ultrasound is first used to detect the cross section of the long flexible needle. A new imaging approach is proposed that selects the pixels with the highest gray levels and directly suppresses the lower gray levels to highlight the needle. A needle shape tracking method is then proposed that combines image processing with a Kalman filter operating on 3D needle positions, yielding a robust tracking procedure for scan intervals from 1 mm to 8 mm. Shape reconstruction is achieved with a curve fitting method, and the needle tip position is finally estimated from the curve fitting result. Experimental results show that the tip position estimation error is less than 1 mm for scan intervals up to 4 mm. The advantage of the proposed method is that the shape and tip position can be estimated by scanning the needle's cross sections at intervals along the insertion direction, without detecting the tip itself.

1. Introduction

With the help of beveled-tip needles, percutaneous interventions and therapies have become widely involved in current clinical procedures, such as brachytherapy [1,2], tissue biopsy [3,4], and drug delivery [5,6]. In these procedures, less needle misplacement leads to more reliable treatment and more accurate medical practice. According to clinical studies [7,8], the needle deflects easily, which causes tip misplacement and may lead to unsafe procedures. Needle–tissue interaction, improper insertion force, or physiological motions such as breathing may displace targets or obstacles, which leads to unexpected errors. To address this challenge, real-time feedback is highly desirable, usually provided by medical imaging devices such as ultrasound (US) [9], computed tomography (CT) [10,11], or magnetic resonance imaging (MRI) [2,12]. Image-guided percutaneous interventions are generally conducted with CT or MRI; however, ultrasound-guided procedures are more attractive owing to advantages such as the absence of ionizing radiation and real-time detection.
Many studies on needle guidance during insertion have been conducted with ultrasound devices, and 2D ultrasound images are widely used, especially sagittal ones (shown in Figure 1). Ayvali et al. proposed a circular Hough transform to locate the needle tip accurately, even when the imaging is out-of-plane [13]. Kaya et al. localized the needle axis and estimated the needle tip using a Gabor filter in sagittal US images [14], and later improved the processing time with a bin packing method to enable real-time execution [15]. Recently, a template-based tracking method with efficient second-order minimization has been used to track the needle [16]. Further novel approaches locate the needle and its tip in sagittal US images, such as signal attenuation maps [17], convolutional neural networks (CNN) [18], and the maximum likelihood estimation sample consensus (MLESAC) method [19]. However, a drawback of sagittal US images is that out-of-plane bending of the needle cannot be detected. Therefore, methods based on sagittal US images are not suitable for needles that may be deflected by unavoidable factors, especially long needles.
An alternative solution is to use 3D US images, which have been widely studied in recent research. Yue et al. used a RANSAC method to detect the needle in 3D US, with a Kalman filter to reduce the error [20]. Chatelain et al. used particle filtering to track a robot-guided flexible needle in 3D US [21]. In addition, a convolutional neural network combined with conventional image processing techniques has been used to track and detect the needle [22], and a naive Bayesian classifier has been used to localize the needle among 3D US volume voxels [23]. However, the large 3D US volumetric dataset makes it difficult to obtain and process data online.
Because of these disadvantages, neither sagittal US images nor 3D US volumes are well suited to a long flexible needle. To locate the needle accurately, methods using transverse US images (shown in Figure 1) have been applied successfully in several studies. For example, Vrooijink et al. [24] presented a method to track a flexible needle during insertion into a gelatin tissue using 2D US images perpendicular to the needle tip; however, the background was noise-free, which makes the method impractical. Waine et al. [25,26,27] focused on needle insertion in permanent prostate brachytherapy (PPB), where needles are typically 200 mm long and easily deflected, and where the rectum limits the movement of the US probe. As a result, it is hard to acquire sagittal images of the curved needle to observe its deflection during insertion: the sagittal approach depends strongly on moving the US probe whenever the needle leaves the image field of view, and this movement may deform the prostate and shift the needle target. For a deflected needle, the transverse US image is therefore the better choice. Compared to sagittal images, transverse US images are easy to acquire by scanning the US probe along the needle, no matter how much the needle is curved.
In this paper, we present a method to track a long curved needle in 2D transverse US images and estimate its tip to guide needle insertion. Ultrasound is first used to detect the cross sections of the long flexible needle (Figure 2, STEP 1). A needle shape tracking method combining needle detection with a Kalman filter provides accurate localization and robust tracking for scan intervals from 1 mm to 8 mm (Figure 2, STEP 2). Unlike the previous study [26], the 3D needle positions obtained from the 2D US images and the optical tracking system are used in the KF for precise localization. A curve fitting method then reconstructs the needle shape, and the tip position is estimated from the needle length and the curve fitting result (Figure 2, STEP 3). The advantage of the proposed method is that the shape and tip position can be estimated by scanning the needle's cross sections at intervals along the insertion direction, without detecting the tip itself. In addition, a novel histogram method is introduced to detect the needle during image processing, which improves needle localization under comet-tail artifacts and poor reflection, despite abrupt intensity changes. A robotic ultrasound system (RUS) [28] was built to evaluate the proposed tip estimation method. Results show that the tip position estimation error is less than 1 mm with a 4 mm scan interval.
The rest of this paper is organized as follows. Section 2 introduces the proposed methods in detail. Section 3 presents the experimental setup and results. Finally, the discussion and conclusions are given in Section 4.

2. Materials and Methods

The proposed needle tip estimation method for successive transverse US images can be divided into three stages: needle detection, needle shape tracking, and tip estimation. The processing diagram is shown in Figure 3. In the first US image, the needle location is manually selected as the initial region of interest (ROI) using the binary method. After that, the needle position predicted by a Kalman filter is transformed into the transverse US image, and the next needle position within this ROI is found by the histogram method. The KF is then updated with the current precise needle position and predicts the next position. After all the cross sections of the needle have been collected, the needle shape is fitted by the curve fitting method (part C in Figure 3). Finally, the needle tip position is estimated from the curve fitting result and the length of the needle. In this study, the ROI is a square window whose side is three times the needle diameter, with its center representing the needle position. Needle detection (part B in Figure 3) is mainly image processing, while the Kalman filter performs the needle shape tracking (part A in Figure 3).

2.1. Needle Detection

Needle detection identifies the cross section of the needle in the US images using the binary method and the histogram method. Since ultrasound is quite sensitive to metal, the needle area appears brighter than its surroundings. A binary method [26] is first used to select the ROI in the first image, and a histogram method is then used to locate the needle despite US shadows and poor reflection.
The binary method strengthens the brightness contrast to highlight and select the brighter areas. It is used for initialization, i.e., to find the candidates in the first image, and includes intensity normalization, background reinforcement, and brightness enhancement; the center of the selected area is the needle location. During the experiment, a histogram method is proposed to find the needle accurately. It contains intensity normalization and background reinforcement: it finds an area of high-intensity pixels corresponding to the upper face of the needle, and then locates the needle center based on the needle diameter. The background reinforcement step can be described as follows:
$$\min I_t \quad \text{s.t.} \quad \frac{\sum_{I_j = I_t}^{255} n(I_j)}{M} \le \delta, \qquad I_t \in [0, 255],$$
where I_j is a gray level, n(I_j) is the number of pixels with gray level I_j, M is the number of pixels in the ROI, and δ is a manually selected parameter limiting the fraction of bright pixels. In this work, the ROI is a square of 45 × 45 pixels, and δ is set to 0.08 based on empirical tests.
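To make this step concrete, the following is a minimal NumPy sketch of the background reinforcement: it searches for the smallest threshold I_t such that the fraction of ROI pixels at or above I_t does not exceed δ, then suppresses everything below it. The function name and the synthetic ROI are our own illustration, not the authors' code.

```python
import numpy as np

def histogram_threshold(roi, delta=0.08):
    """Find the smallest gray level i_t such that the fraction of ROI
    pixels with gray level >= i_t is at most delta, then zero out all
    pixels below i_t (background reinforcement)."""
    m = roi.size  # M: number of pixels in the ROI
    hist = np.bincount(roi.ravel(), minlength=256)
    # tail_counts[g] = number of pixels with gray level >= g
    tail_counts = np.cumsum(hist[::-1])[::-1]
    candidates = np.where(tail_counts / m <= delta)[0]
    i_t = int(candidates[0]) if candidates.size else 255
    enhanced = np.where(roi >= i_t, roi, 0).astype(roi.dtype)
    return i_t, enhanced

# Example on a synthetic 45x45 ROI with a small bright "needle" patch
rng = np.random.default_rng(0)
roi = rng.integers(0, 80, size=(45, 45), dtype=np.uint8)
roi[20:23, 20:23] = 250  # bright needle cross section
i_t, enhanced = histogram_threshold(roi, delta=0.08)
```

Only the brightest pixels (here, the needle patch) survive the suppression, which is what the histogram method then clusters into the needle's upper surface.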
Two unexpected situations may affect the position accuracy: the comet tail and poor reflection. The comet-tail artifact enlarges the apparent needle area beyond its actual size (Figure 4a). On the contrary, poor reflection makes the needle area look much smaller in the image (Figure 4b). Accurate localization should therefore eliminate the effects of shadows and poor reflection. Figure 4 shows two examples processed with the two methods, respectively. In the example of Figure 4a, the noise (yellow circle in the ROI histogram) could be mistaken for the needle, while the needle itself occupies only a few pixels (red circle in the ROI histogram). Both methods filter the noise and locate the needle correctly. However, during ROI configuration (shown in Figure 3), the histogram method finds more candidates than the binary method, since the former focuses on the highest-intensity pixels while the latter considers both area and intensity. Therefore, the binary method is more suitable for ROI configuration.
In the case of poor reflection, however, little of the needle is visible in the image and the needle area is smaller than expected. Because the needle reflects as long as the image gain or acoustic power is high enough, it still appears distinctly against its surroundings. An example ROI histogram is shown in Figure 4b, where the red circle marks the upper surface of the needle. The ROI square is darker than that of Figure 4a, although the ultrasound settings are the same in both cases. From the figure, the histogram method appears more accurate than the binary method; in fact, the difference between the two methods in this case is 0.34 mm for a needle of 1.2 mm diameter, so the advantage is not decisive. In this study, we use the histogram method during the experiments.

2.2. Needle Shape Tracking

As indicated in previous research [20,29,30], the Kalman filter has been used successfully to track needles in successive ultrasound images. In this study, the Kalman filter improves the estimation of the needle location across successive frames. As shown in Figure 5, the applied Kalman filter has two stages, prediction and update. The prediction stage locates the needle position in advance and sets the ROI (red and yellow squares in Figure 5) so that the needle can be found precisely within a small window, which reduces the computation. The update stage refines the needle position after the measurement from the histogram method.
The state prediction t̂_i represents the predicted transverse needle center position (x, y, z) in the reference frame together with the change of the needle position (Δx, Δy, Δz) at sample i. (Δx, Δy, Δz) is the difference between the previous and current needle positions. t_i is the result of the previous iteration:
$$t_i = \begin{bmatrix} x_i & y_i & z_i & \Delta x_i & \Delta y_i & \Delta z_i \end{bmatrix}^{T},$$
where Δx_1 and Δy_1 are set to 0, while Δz_1 equals the scan interval. The prediction equations are as follows:
$$\hat{t}_i = A t_{i-1},$$
$$\hat{P}_i = A P_{i-1} A^{T} + Q.$$
The measurement update equations are as follows:
$$K_i = \hat{P}_i H^{T} (H \hat{P}_i H^{T} + R)^{-1},$$
$$t_i = \hat{t}_i + K_i (m_i - H \hat{t}_i),$$
$$P_i = (I - K_i H) \hat{P}_i,$$
where A, H, R, and Q are as follows:
$$A = \begin{bmatrix} 1 & 0 & 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix},$$
$$H = I_{6 \times 6}, \qquad R = Q = 10^{-6} \times I_{6 \times 6},$$
where A is the state transition matrix and H is the measurement matrix. P̂_i and P_i are the a priori and a posteriori estimate error covariances, R and Q are the measurement and process error covariances, respectively, K_i is the Kalman gain at sample i, and m_i is the measurement state from the needle detection.
The 3D predicted position (x_prediction, y_prediction, z_prediction) is obtained from the previous state. Before needle detection in the current US image, it is transformed onto the image plane as the 2D position (I_x_prediction, I_y_prediction). After the update, the needle position (I_x_update, I_y_update) in the image is transformed back into the 3D position (x_m, y_m, z_m). Meanwhile, (Δx_m, Δy_m, Δz_m) is obtained from:
$$(\Delta x_m, \Delta y_m, \Delta z_m) = (x_m, y_m, z_m) - (x_{previous}, y_{previous}, z_{previous}).$$
As a result, the measurement state m_i for this frame is (x_m, y_m, z_m, Δx_m, Δy_m, Δz_m). Through the measurement update, the current state (x_c, y_c, z_c, Δx_c, Δy_c, Δz_c) is obtained from Equation (6). The transformations between frames are described in the next subsection. In the previous study [26], the image data had only two dimensions and lacked information along the direction of probe movement, leading to an incomplete localization. Since spatial information localizes the needle more accurately than planar information alone, 3D positions are used in the KF for precise localization. The KF in this work is used not only for filtering but also for predicting the next needle position in the US image: the ROI for the next iteration is centered on the KF-predicted needle position, which helps to exclude outliers from the ROI.

2.3. Tip Estimation

Before tip estimation, the 2D points must be transformed into 3D points based on the relationships between the frames. The relationships among the reference frame, probe frame, marker frame, and image frame are specified in Figure 6.
As shown in Figure 6, the image is a plane with two axes (x and y), and the z axis is perpendicular to the image. The probe frame coincides with the image frame, except that the probe frame is expressed in millimeters while the image frame is expressed in pixels. Equation (9) gives the transformation from image to reference:
$$Point_1 = T_{marker}^{ref} \times T_{probe}^{marker} \times T_{image}^{probe} \times Point_2,$$
where Point_1 and Point_2 are the same point expressed in the reference frame and the image frame, respectively; T_marker^ref is the transformation from the marker frame to the reference frame, T_probe^marker from the probe frame to the marker frame, and T_image^probe from the image frame to the probe frame. Through this transformation, the needle position in the image can be directly re-expressed in the reference frame for needle tracking and curve fitting.
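As a small numerical sketch of this chain, homogeneous 4 × 4 transforms can be composed exactly as in Equation (9). All calibration values below (pixel pitch, probe-to-marker offset, marker pose) are hypothetical placeholders, not the system's actual calibration.

```python
import numpy as np

def make_T(linear, p):
    """Build a 4x4 homogeneous transform from a 3x3 linear part and a
    translation vector p."""
    T = np.eye(4)
    T[:3, :3] = linear
    T[:3, 3] = p
    return T

px_per_mm = 10.0  # hypothetical image resolution
# Image frame -> probe frame: same axes, pixels converted to millimetres
T_image_to_probe = make_T(np.eye(3) / px_per_mm, np.zeros(3))
# Probe frame -> marker frame: a fixed rigid offset (hypothetical)
T_probe_to_marker = make_T(np.eye(3), np.array([0.0, 0.0, 30.0]))
# Marker frame -> reference frame: pose reported by the optical tracker
theta = np.deg2rad(90)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
T_marker_to_ref = make_T(Rz, np.array([100.0, 50.0, 0.0]))

# A detected needle centre in the image (pixels), z = 0 on the image plane
point_image = np.array([120.0, 80.0, 0.0, 1.0])
point_ref = T_marker_to_ref @ T_probe_to_marker @ T_image_to_probe @ point_image
```

Each detected cross-section center goes through this chain before it is handed to the Kalman filter and the curve fitting.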
The tip estimation has two steps: curve fitting and tip estimation. In this study, a third-order curve is used for the shape reconstruction, with equations:
$$f(x) = \sum_{k=0}^{3} a_k x^k,$$
$$g(x) = \sum_{k=0}^{3} b_k x^k,$$
where f(x) and g(x) fit the curve along x, the axis aligned with the insertion direction, and (a_0, a_1, a_2, a_3) and (b_0, b_1, b_2, b_3) are the free parameters of the needle shape model.
After the sample points of the deflected needle have been obtained, the least-squares curve fitting method is used to fit them with a cubic curve. The target functions are defined as follows:
$$F(a_0, a_1, a_2, a_3) = \min \sum_{i=1}^{n} \left( f(x_i) - y_i \right)^2 = \min \sum_{i=1}^{n} \Big( \sum_{k=0}^{3} a_k x_i^k - y_i \Big)^2,$$
$$G(b_0, b_1, b_2, b_3) = \min \sum_{i=1}^{n} \left( g(x_i) - z_i \right)^2 = \min \sum_{i=1}^{n} \Big( \sum_{k=0}^{3} b_k x_i^k - z_i \Big)^2,$$
where n is the number of points (n ≥ 4) and (x_i, y_i, z_i) is the position of point i. Applying ℓ2-norm minimization, the problem for f(x) can be formulated as:
$$\arg\min_{a \in \mathbb{R}^{4}} \| X a - Y \|^{2},$$
where $a = [a_0, a_1, a_2, a_3]^{T}$, $Y = [y_1, y_2, \ldots, y_n]^{T}$, and X can be written as:
$$X = \begin{bmatrix} 1 & x_1 & x_1^2 & x_1^3 \\ 1 & x_2 & x_2^2 & x_2^3 \\ \vdots & \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 & x_n^3 \end{bmatrix}.$$
The solution can be estimated as follows:
$$a = (X^{T} X)^{-1} X^{T} Y.$$
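The normal-equation solution above can be sketched in a few lines of NumPy. The sample points below are synthetic, generated from a known cubic so the recovered coefficients can be checked; solving the linear system directly is used instead of forming the explicit inverse, which is numerically equivalent but more stable.

```python
import numpy as np

# Synthetic 3D needle cross-section centres along the insertion axis x,
# generated from a known cubic deflection (coefficients are illustrative)
x = np.linspace(0.0, 10.0, 20)
y = 0.002 * x**3 + 0.05 * x   # in-plane deflection f(x)
z = 0.001 * x**3              # out-of-plane deflection g(x)

# Vandermonde matrix with columns [1, x, x^2, x^3]
X = np.vander(x, 4, increasing=True)

# Normal-equation solution a = (X^T X)^{-1} X^T Y, via a linear solve
a = np.linalg.solve(X.T @ X, X.T @ y)  # coefficients of f(x)
b = np.linalg.solve(X.T @ X, X.T @ z)  # coefficients of g(x)
```

With noise-free data the fitted coefficients reproduce the generating cubic; with real detections the least-squares fit smooths the residual localization noise.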
From this solution, f(x) and g(x) are obtained to reconstruct the needle shape. The tip position can then be estimated from the needle length via the following optimization:
$$\min \Big( \int_{tail_x}^{tip_x} \sqrt{1 + f'(x)^2 + g'(x)^2} \, dx - L \Big)^{2} \quad \text{s.t.} \quad tip_x > tail_x,$$
where tail_x is the tail position measured by the optical tracker, tip_x is the expected tip position along the x axis, and L is the length of the needle.
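Since the arc length grows monotonically with tip_x, this optimization can be solved numerically, for instance by bisection on the arc-length integral. The sketch below uses trapezoidal integration; the shape coefficients, needle length, and the choice of bisection are illustrative assumptions, not the authors' solver.

```python
import numpy as np

# Hypothetical fitted shape coefficients (increasing powers of x) and
# needle length between tail and tip; values are for illustration only
a = np.array([0.0, 0.05, 0.0, 0.002])  # f(x): in-plane shape
b = np.array([0.0, 0.0, 0.0, 0.001])   # g(x): out-of-plane shape
L = 12.0                               # needle length from tail to tip (mm)
tail_x = 0.0

def arc_length(x_end, n=2000):
    """Trapezoidal approximation of the arc length of the fitted curve
    from tail_x to x_end: integral of sqrt(1 + f'(x)^2 + g'(x)^2)."""
    x = np.linspace(tail_x, x_end, n)
    df = np.polyval(np.polyder(a[::-1]), x)  # f'(x)
    dg = np.polyval(np.polyder(b[::-1]), x)  # g'(x)
    y = np.sqrt(1.0 + df**2 + dg**2)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Arc length >= straight-line distance, so tip_x lies in (tail_x, tail_x + L];
# bisection finds tip_x with arc_length(tip_x) = L
lo, hi = tail_x, tail_x + L
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if arc_length(mid) < L:
        lo = mid
    else:
        hi = mid
tip_x = 0.5 * (lo + hi)
```

The estimated tip is then the 3D point (tip_x, f(tip_x), g(tip_x)) on the reconstructed curve.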

3. Results

3.1. Experimental Platform Setup

To verify the proposed tip tracking and shape sensing method, a robotic ultrasound system was built, which includes a KUKA IIWA robot arm, a Wisonic ultrasound scanner, an NDI optical tracker, an NDI electromagnetic (EM) tracker, and a computer. As shown in Figure 7, the US probe is mounted on the end effector of the robot arm by a gripper with an attached passive marker. The phantom or ex-vivo tissue (such as the chicken breast in Figure 7) is punctured by a 200 mm long 18G beveled-tip needle, with the tip left completely exposed for validation. The needle diameter is 15 pixels in the image. The NDI optical tracker localizes the marker fixed to the probe, while the NDI electromagnetic tracker validates the tip position.
Experiments were conducted in a water tank, which provides a liquid environment for the ultrasound. The needle was placed in water or inserted into a silica gel phantom (shown in Figure 8a), pork, or chicken breast. The ultrasound depth is set to 4 cm; in this study, the needle is usually detected 1 to 3 cm from the US probe. During the experiment, the robot arm automatically moves the ultrasound probe along the needle insertion direction without contacting the tissue (shown in Figure 8b). The total scan length is at most 160 mm, depending on the scan interval (shown in Table 1); the smaller the scan interval, the more points are collected.
Before data collection, the US image and the marker are calibrated. The experiment starts after the needle has been inserted manually, imitating the real situation of a percutaneous intervention. The robot arm then moves the US probe along the needle while pose data are collected from the optical tracker and US images from the ultrasound scanner. Finally, the tail and tip of the needle are measured by the optical and electromagnetic sensors, respectively, for the curve fitting and the tip validation.

3.2. Tip Estimation

Four platforms were used in the experiments: water, phantom, chicken, and pork, each tested several times. Figure 9 shows one test in chicken breast, in which the US probe moved along the needle every 4 mm. The black square point on the left is the needle tail position and the blue line is the estimated needle shape. The green points are the detected needle cross-section centers, the blue point is the estimated tip position, and the red point is the tip position measured by the EM tracker. The estimation error in this test is 0.69 mm.
The errors of the algorithm are shown in Table 2, which indicates that the error increases with the scan interval. Figure 10 shows the results on the four platforms: the mean errors are all under 0.4 mm with a 1 mm scan interval, while the error is around 1 mm with an 8 mm interval.

4. Discussion and Conclusions

Needle insertion guided by ultrasound images is widely used for percutaneous interventions. However, detecting the needle is challenging because it is deflected during insertion by unavoidable factors, including needle–tissue interaction, improper insertion force, and physiological motions. Automatic needle detection and tracking in 2D transverse US images can overcome these limitations and estimate the needle tip through curve fitting. The goal of this study is a robust needle detection and tracking method that uses ultrasound images to estimate the needle tip precisely and accurately. We used a histogram method to detect the needle in transverse US images, reducing the effects of comet-tail artifacts and poor reflection. In subsequent post-processing, the needle was tracked by a Kalman filter across consecutive US images with the help of the probe displacement, and a third-order curve fitting method estimated the needle tip.
When the probe is moved by the robot arm, the scanning time varies; we assume the time the probe stops to collect data is the same at each position. The smaller the scan interval, the more points are collected and the longer the scan takes. The scanning time therefore depends mainly on the number of scanning points, while the accuracy depends on how short the scan interval is. In other words, the accuracy of the tip estimation can be improved by reducing the scan interval to collect more needle positions, at the cost of more scanning time and lower tracking efficiency. Conversely, fewer collected positions cost less time but increase the likelihood of shape reconstruction failure and of large tip estimation errors. Balancing precision and scanning time is thus essential for making the proposed method efficient. In our experiments, a 4 mm scan interval yields an error below 1 mm, which is a good choice to satisfy both requirements.
In the proposed method, needle shape tracking contributes greatly to accurate localization, since the Kalman filter tracks the needle precisely through its prediction and update steps. However, needle shape tracking depends heavily on the scan interval, especially for a strongly curved needle, because the Kalman filter generally performs well only in near-linear systems. Consequently, if the needle deflects during insertion, the Kalman filter may mispredict the needle position when the scan interval is large. In this study, the Kalman filter failed when the scan interval exceeded 8 mm, likely because large deviations corrupt the KF prediction; this deviation is not eliminated even by changing the ROI size. The histogram method detected the needle accurately and effectively, but it relies on the image brightness: the needle cannot be easily detected when many pixels saturate at the highest intensity (255). This condition can, however, be controlled through the ultrasound scanner settings by expanding the gray-level range of the image appropriately.
The proposed method still has limitations. Time is needed for data collection from the US scanner and optical tracking system and for the movement of the robot arm, which depends on the scan length and scan interval. However, it is very hard for any medical imaging sensor to acquire the whole position of a long needle in one scan. In the future, it would therefore be valuable to reduce the number of needle detections in order to reduce the tip estimation time. Moreover, even when the robot arm moves the US probe precisely, the probe motion may deform the tissue and needle, so the robot arm must move smoothly and accurately on the tissue surface. In addition, patient motion is the largest source of uncertainty and can cause needle insertion and detection to fail.
In the proposed system, we use a 2D US scanner to detect the needle in various kinds of tissue. A 3D US scanner could also be used: although it provides volume data and would require a different detection method, the tracking method could remain similar, since we already use 3D positions for tracking in this study. However, time would still be needed for data collection and robot arm movement, especially for a long needle, which easily moves out of the field of view of US volumes or images. For these reasons, we use 2D US images in this study.
In this paper, a method for tracking a long curved needle in 2D transverse US images and estimating its tip is presented and demonstrated with the RUS. Ultrasound is first used to detect the cross sections as the probe moves along the long flexible needle. The needle shape tracking method, which combines needle detection with a Kalman filter using 3D needle positions, provides accurate localization and robust tracking. The needle shape is then reconstructed by curve fitting and the tip position is estimated from that result. A histogram method is introduced to detect the needle during image processing, which improves needle localization despite abrupt intensity changes; this imaging approach selects the pixels with the highest gray levels and directly suppresses the lower gray levels to highlight the needle. The experimental results suggest that needle detection by the histogram method and Kalman filter is highly precise, with a minimum error of 0.13 mm at a 1 mm scan interval in the phantom experiment and a maximum error of 1.35 mm at an 8 mm scan interval in the pork experiment; the mean error rises as the scan interval increases. Moreover, the tip position estimation error is less than 1 mm for scan intervals up to 4 mm. We suggest a 4 mm scan interval to balance precision and scanning time and maximize efficiency. In future work, we will investigate the optimal scan length for estimating the needle tip. The proposed method could greatly assist surgeons in locating the needle tip during percutaneous insertion procedures with a long flexible needle, such as prostate brachytherapy.

Author Contributions

Conceptualization, M.Q.-H.M.; validation, Z.L.; writing—original draft, Z.L.; writing—review and editing, S.S. and L.L.

Funding

This research was funded in part by the National Natural Science Foundation of China, grant number 61803123, and in part by the National Key R&D Program of China, grant number 2018YFB1307700, and in part by the Natural Science Foundation of Guangdong Province, China, grant number 2018A030310565.

Acknowledgments

For this kind of study, no formal ethics approval is required by the institutional ethics committee. Informed consent was obtained from all individual participants included in the study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
KF	Kalman filter
US	Ultrasound
RUS	Robotic ultrasound system
CT	Computed tomography
MRI	Magnetic resonance imaging
RANSAC	Random sample consensus

References

  1. Orlando, N.; Snir, J.; Barker, K.; Hoover, D.; Fenster, A. Power Doppler ultrasound imaging with mechanical perturbation for improved intraoperative needle tip identification during prostate brachytherapy: A phantom study. Proc. SPIE 2019, 1095131.
  2. Henken, K.R.; Seevinck, P.R.; Dankelman, J.; van den Dobbelsteen, J.J. Manually controlled steerable needle for MRI-guided percutaneous interventions. Med. Biol. Eng. Comput. 2017.
  3. Nachabé, R.; Hendriks, B.H.W.; Schierling, R.; Hales, J.; Racadio, J.M.; Rottenberg, S.; Ruers, T.J.M.; Babic, D.; Racadio, J.M. Real-time in vivo characterization of primary liver tumors with diffuse optical spectroscopy during percutaneous needle interventions: Feasibility study in woodchucks. Investig. Radiol. 2015, 50, 443–448.
  4. Mehrjardi, M.Z.; Bagheri, S.M.; Darabi, M. Successful ultrasound-guided percutaneous embolization of renal pseudoaneurysm by autologous blood clot: Preliminary report of a new method. J. Clin. Ultrasound 2017, 45, 592–596.
  5. Jun, H.; Ahn, M.H.; Choi, I.J.; Baek, S.K.; Park, J.H.; Choi, S.O. Immediate separation of microneedle tips from base array during skin insertion for instantaneous drug delivery. RSC Adv. 2018, 8, 17786–17796.
  6. Park, H.; Kim, H.; Lee, S.J. Optimal Design of Needle Array for Effective Drug Delivery. Ann. Biomed. Eng. 2018, 46, 2012–2022.
  7. Renfrew, M.; Griswold, M.; Çavuşoĝlu, M.C. Active localization and tracking of needle and target in robotic image-guided intervention systems. Auton. Robot. 2018, 42, 83–97.
  8. Rossa, C.; Tavakoli, M. Issues in closed-loop needle steering. Control Eng. Pract. 2017, 62, 55–69.
  9. Van de Berg, N.J.; Sánchez-Margallo, J.A.; van Dijke, A.P.; Langø, T.; van den Dobbelsteen, J.J. A Methodical Quantification of Needle Visibility and Echogenicity in Ultrasound Images. Ultrasound Med. Biol. 2019.
  10. Li, R.; Xu, S.; Pritchard, W.F.; Karanian, J.W.; Krishnasamy, V.P.; Wood, B.J.; Tse, Z.T.H. AngleNav: MEMS Tracker to Facilitate CT-Guided Puncture. Ann. Biomed. Eng. 2018, 46, 452–463.
  11. Shellikeri, S.; Setser, R.M.; Hwang, T.J.; Srinivasan, A.; Krishnamurthy, G.; Vatsky, S.; Girard, E.; Zhu, X.; Keller, M.S.; Cahill, A.M. Real-time fluoroscopic needle guidance in the interventional radiology suite using navigational software for percutaneous bone biopsies in children. Pediatr. Radiol. 2017, 47, 963–973.
  12. Raj, S.D.; Agrons, M.M.; Woodtichartpreecha, P.; Kalambo, M.J.; Dogan, B.E.; Le-Petross, H.; Whitman, G.J. MRI-guided needle localization: Indications, tips, tricks, and review of the literature. Breast J. 2019, 479–483.
  13. Ayvali, E.; Desai, J.P. Optical Flow-Based Tracking of Needles and Needle-Tip Localization Using Circular Hough Transform in Ultrasound Images. Ann. Biomed. Eng. 2015, 43, 1828–1840.
  14. Kaya, M.; Bebek, O. Needle Localization Using Gabor Filtering in 2D Ultrasound Images. In Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, 29 September 2014; pp. 4881–4886.
  15. Kaya, M.; Senel, E.; Ahmad, A.; Orhan, O.; Bebek, O. Real-time needle tip localization in 2D ultrasound images for robotic biopsies. In Proceedings of the IEEE International Conference on Robotics and Automation, Istanbul, Turkey, 27–31 July 2015; pp. 47–52.
  16. Kaya, M.; Senel, E.; Ahmad, A.; Bebek, O. Visual needle tip tracking in 2D US guided robotic interventions. Mechatronics 2019, 57, 129–139.
  17. Mwikirize, C.; Nosher, J.L.; Hacihaliloglu, I. Signal attenuation maps for needle enhancement and localization in 2D ultrasound. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 363–374.
  18. Mwikirize, C.; Nosher, J.L.; Hacihaliloglu, I. Convolution neural networks for real-time needle detection and localization in 2D ultrasound. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 647–657.
  19. Xu, F.; Gao, D.; Wang, S.; Zhanwen, A. MLESAC Based Localization of Needle Insertion Using 2D Ultrasound Images. J. Phys. Conf. Ser. 2018, 1004.
  20. Yue, Z.; Liebgott, H.; Cachard, C. Tracking biopsy needle using Kalman filter and RANSAC algorithm with 3D ultrasound. In Proceedings of the Acoustics 2012, Nantes, France, 23–27 April 2012; pp. 231–236.
  21. Chatelain, P.; Krupa, A.; Navab, N. 3D ultrasound-guided robotic steering of a flexible needle via visual servoing. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 2250–2255.
  22. Arif, M.; Moelker, A.; van Walsum, T. Automatic needle detection and real-time Bi-planar needle visualization during 3D ultrasound scanning of the liver. Med. Image Anal. 2019, 53, 104–110.
  23. Younes, H.; Voros, S.; Troccaz, J. Automatic needle localization in 3D ultrasound images for brachytherapy. In Proceedings of the 2018 IEEE 15th International Symposium on Biomedical Imaging (ISBI 2018), Washington, DC, USA, 4–7 April 2018; pp. 1203–1207.
  24. Vrooijink, G.J.; Abayazid, M.; Misra, S. Real-time three-dimensional flexible needle tracking using two-dimensional ultrasound. In Proceedings of the IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 1688–1693.
  25. Waine, M.; Rossa, C.; Sloboda, R.; Usmani, N.; Tavakoli, M. 3D shape visualization of curved needles in tissue from 2D ultrasound images using RANSAC. In Proceedings of the IEEE International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015; pp. 4723–4728.
  26. Waine, M.; Rossa, C.; Sloboda, R.; Usmani, N.; Tavakoli, M. Needle Tracking and Deflection Prediction for Robot-Assisted Needle Insertion Using 2D Ultrasound Images. J. Med. Robot. Res. 2016, 1, 1640001.
  27. Waine, M.; Rossa, C.; Sloboda, R.; Usmani, N.; Tavakoli, M. 3D Needle Shape Estimation in TRUS-Guided Prostate Brachytherapy Using 2D Ultrasound Images. IEEE J. Biomed. Health Inform. 2015, 2194, 1–11.
  28. Priester, A.M.; Natarajan, S.; Culjat, M. Robotic ultrasound systems in medicine. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 2013, 60, 507–523.
  29. Mignon, P.; Poignet, P.; Troccaz, J. Automatic Robotic Steering of Flexible Needles from 3D Ultrasound Images in Phantoms and Ex Vivo Biological Tissue. Ann. Biomed. Eng. 2018, 46, 1385–1396.
  30. Mignon, P.; Poignet, P.; Troccaz, J. Beveled-tip needle-steering using 3D ultrasound, mechanical-based Kalman filter and curvilinear ROI prediction. In Proceedings of the 2016 14th International Conference on Control, Automation, Robotics and Vision (ICARCV), Phuket, Thailand, 13–15 November 2016; pp. 1–6.
Figure 1. Two methods of needle detection using 2D ultrasound.
Figure 2. The proposed tip estimation method. STEP 1: 2D transverse US images containing needle cross-sections are collected using the RUS; STEP 2: needle cross-sections are detected and tracked in the successive US images; STEP 3: the needle shape is reconstructed and its tip is estimated.
Figure 3. The pipeline of curved needle tip estimation. Once the ROI configuration is finished, needle tracking with needle detection proceeds step by step. Part A shows the needle tracking by the KF, Part B shows the needle detection used to locate the needle, and Part C shows the shape reconstruction and tip estimation.
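The shape reconstruction and tip estimation of Part C can be illustrated with a simple least-squares sketch: fit polynomials x(z) and y(z) to the tracked cross-section centers, then evaluate them at the deepest scanned depth. This is a minimal illustration under assumed data, not the authors' implementation; the synthetic centers, polynomial degree, and scan interval below are placeholders.

```python
import numpy as np

def estimate_tip(centers, degree=2):
    """Fit x(z) and y(z) polynomials to detected cross-section centers
    and evaluate them at the deepest scanned z to estimate the tip."""
    centers = np.asarray(centers, dtype=float)  # rows: (x, y, z) in mm
    z = centers[:, 2]
    # Independent least-squares fits for the two transverse coordinates.
    px = np.polyfit(z, centers[:, 0], degree)
    py = np.polyfit(z, centers[:, 1], degree)
    z_tip = z.max()                     # last scanned cross-section
    return np.polyval(px, z_tip), np.polyval(py, z_tip), z_tip

# Synthetic curved-needle centers: x bends quadratically with depth z.
zs = np.arange(0.0, 160.0, 4.0)         # assumed 4 mm scan interval
pts = np.stack([0.001 * zs**2, 0.02 * zs, zs], axis=1)
x_tip, y_tip, z_tip = estimate_tip(pts)
```

Evaluating the fitted curve at the last scanned depth mirrors how the tip position is estimated from cross-sections alone, without ever detecting the tip directly.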
Figure 4. Two cases that may generate errors: (a) the comet-tail artifact of the needle, shown with the binary method, the histogram method, and the histogram of the ROI; (b) poor reflection from the needle, shown with the histogram method and the histogram of the ROI.
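The histogram-based highlighting compared in Figure 4 can be sketched as follows: walk the ROI histogram from the highest gray level downward until a fixed pixel budget is reached, then suppress everything below the resulting threshold so that only the bright needle cross-section remains. The toy ROI and the budget `n_pixels` here are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def highlight_needle(roi, n_pixels=50):
    """Keep only the n_pixels brightest pixels of an 8-bit grayscale ROI.

    Walking the histogram from the highest gray level down yields a
    threshold admitting roughly n_pixels pixels; lower gray levels are
    removed directly, highlighting the needle cross-section.
    """
    hist = np.bincount(roi.ravel(), minlength=256)
    count, threshold = 0, 255
    for level in range(255, -1, -1):
        count += hist[level]
        threshold = level
        if count >= n_pixels:
            break
    out = np.zeros_like(roi)
    out[roi >= threshold] = 255    # binary mask of the brightest pixels
    return out

# Toy ROI: dim background with a small bright blob (the needle).
roi = np.full((40, 40), 30, dtype=np.uint8)
roi[18:22, 18:22] = 200
mask = highlight_needle(roi, n_pixels=16)
```

Unlike a fixed binary threshold, the pixel-count criterion adapts to overall ROI brightness, which is what lets it cope with the poor-reflection case in Figure 4b.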
Figure 5. The two steps of the Kalman filter. When the next US image is acquired, the previous state (x_previous, y_previous, z_previous, ẋ_previous, ẏ_previous, ż_previous) is used to predict the needle position (x_prediction, y_prediction, z_prediction), which is then transformed to (I_x_prediction, I_y_prediction) in the image as the center of the ROI. The yellow square is the ROI corresponding to (I_x_prediction, I_y_prediction), and the red one shows the update step of the KF, which uses the measurement data from needle detection to locate the needle with its center (I_x_update, I_y_update) as the needle position. Finally, the measurement state (x_m, y_m, z_m, ẋ_m, ẏ_m, ż_m) and the current state (x_c, y_c, z_c, ẋ_c, ẏ_c, ż_c) are obtained.
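The predict/update cycle of Figure 5 can be sketched with a constant-velocity Kalman filter on the 3D needle position: the predicted position centers the ROI, and the detected cross-section center then corrects the state. The matrices, noise covariances, and measurement below are illustrative assumptions, not the paper's tuned parameters.

```python
import numpy as np

dt = 1.0                                # one scan step between images
# Constant-velocity model: state = (x, y, z, x', y', z').
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)              # position += velocity * dt
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # only positions measured
Q = 1e-3 * np.eye(6)                    # process noise (assumed)
R = 1e-1 * np.eye(3)                    # measurement noise (assumed)

def kf_step(x, P, z_meas):
    """One predict/update cycle: the predicted position centers the ROI,
    then the detected needle center corrects the state."""
    # Predict: propagate the previous state to get the ROI center.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fuse the measured cross-section center.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z_meas - H @ x_pred)
    P_new = (np.eye(6) - K @ H) @ P_pred
    return x_new, P_new, x_pred[:3]

x = np.array([0., 0., 0., 0.1, 0., 4.])  # advancing 4 mm per scan in z
P = np.eye(6)
x, P, roi_center = kf_step(x, P, np.array([0.12, 0.0, 4.1]))
```

Because the ROI is re-centered on the prediction at every scan, the tracker tolerates the larger jumps between cross-sections that come with wide scan intervals.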
Figure 6. The relationship of the frames.
Figure 7. The devices of the experiment.
Figure 8. The phantom used in the experiment and the movement of the probe.
Figure 9. Experiment in chicken breast with a 4 mm scan interval.
Figure 10. The experiment with different scan intervals on the four platforms.
Table 1. Scan lengths with different scan intervals.
Scan Interval	Scan Length	Points
1 mm	160 mm	160
2 mm	159 mm	80
3 mm	160 mm	54
4 mm	157 mm	40
5 mm	156 mm	32
6 mm	157 mm	27
7 mm	155 mm	23
8 mm	153 mm	20
Table 2. The results of tip estimation (mm).
Accuracy
Intervals	Water	Phantom	Pork	Chicken
1 mm	0.32 ± 0.10	0.33 ± 0.15	0.37 ± 0.07	0.29 ± 0.09
2 mm	0.36 ± 0.21	0.41 ± 0.16	0.31 ± 0.05	0.31 ± 0.12
3 mm	0.45 ± 0.16	0.46 ± 0.18	0.56 ± 0.08	0.33 ± 0.09
4 mm	0.55 ± 0.16	0.59 ± 0.31	0.68 ± 0.14	0.38 ± 0.21
5 mm	0.60 ± 0.13	0.59 ± 0.29	0.90 ± 0.20	0.40 ± 0.14
6 mm	0.69 ± 0.13	0.48 ± 0.09	0.99 ± 0.13	0.55 ± 0.11
7 mm	0.83 ± 0.17	0.62 ± 0.14	0.84 ± 0.30	0.52 ± 0.17
8 mm	0.95 ± 0.11	0.73 ± 0.26	1.06 ± 0.18	0.81 ± 0.19

Share and Cite

MDPI and ACS Style

Li, Z.; Song, S.; Liu, L.; Meng, M.Q.-H. Tip Estimation Method in Phantoms for Curved Needle Using 2D Transverse Ultrasound Images. Appl. Sci. 2019, 9, 5305. https://doi.org/10.3390/app9245305

