Abstract:
The present invention relates to an apparatus (10) for tracking a position of an interventional device (11) relative to an image plane (12) of an ultrasound field. The position includes an out-of-plane distance (Dop). A geometry-providing unit (GPU) includes a plurality of transducer-to-distal-end lengths (Ltde1 . . . n), each length corresponding to a predetermined distance (Ltde) between a distal end (17, 47) of an interventional device (11, 41) and an ultrasound detector (16, 46) attached to the interventional device, for each of a plurality of interventional device types (T1 . . . N). An image fusion unit (IFU) receives data indicative of the type (T) of the interventional device being tracked and, based on the type (T): selects from the geometry-providing unit (GPU) a corresponding transducer-to-distal-end length (Ltde); and indicates in a reconstructed ultrasound image (RUI) both the out-of-plane distance (Dop) and the transducer-to-distal-end length (Ltde) for the interventional device within the ultrasound field.
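As a rough illustration only (not part of the abstract), the type-based selection could look like the following Python sketch; the DEVICE_GEOMETRY_MM table, the annotate_image helper, and its add_label overlay call are hypothetical names and values:

```python
# Illustrative only: hypothetical geometry table mapping device type T to
# its transducer-to-distal-end length Ltde (values are made up).
DEVICE_GEOMETRY_MM = {
    "biopsy_needle_18g": 12.0,
    "catheter_5f": 8.5,
}

def annotate_image(rui, device_type: str, d_op_mm: float) -> None:
    """Indicate both Dop and Ltde for the tracked device in image RUI.
    `rui.add_label` stands in for whatever overlay API the system exposes."""
    l_tde_mm = DEVICE_GEOMETRY_MM[device_type]  # select Ltde for type T
    rui.add_label(f"out-of-plane distance: {d_op_mm:.1f} mm")
    rui.add_label(f"sensor-to-tip length: {l_tde_mm:.1f} mm")
```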
Abstract:
A transperineal prostate intervention device comprises a prostate intervention instrument (10), a transrectal ultrasound (TRUS) probe (12), and a mechanical or optical coordinate measurement machine (CMM) (20) attached to the TRUS probe and configured to track the prostate intervention instrument. The CMM may include an articulated arm with a plurality of encoding joints (24), an anchor end (30) attached to the TRUS probe, and a movable end (32) attached to the prostate intervention instrument. The prostate intervention instrument may, for example, be a biopsy needle, a brachytherapy seed delivery instrument, a tissue ablation instrument, or a hollow cannula. An electronic processor (40) computes a predicted trajectory (54) of the prostate intervention instrument in a frame of reference of the TRUS probe using the CMM attached to the TRUS probe. A representation (56) of the predicted trajectory is superimposed on a prostate ultrasound image (50) generated from ultrasound data collected by the TRUS probe.
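A minimal sketch of how an articulated arm with encoding joints could yield the predicted trajectory (54) in the TRUS frame, assuming a standard Denavit-Hartenberg parameterization of the arm (an assumption; the abstract does not specify a kinematic model):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform (4x4 homogeneous matrix)."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def instrument_pose_in_trus(joint_angles, dh_params):
    """Chain the encoded-joint transforms from the anchor end (on the TRUS
    probe) to the movable end (on the intervention instrument)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

def predicted_trajectory(T, length_mm=100.0):
    """Predicted straight-line trajectory: tip position plus instrument axis
    (assumed here to be the local z axis of the movable end)."""
    origin, axis = T[:3, 3], T[:3, 2]
    return origin, origin + length_mm * axis
```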
Abstract:
A tool navigation system employing an ultrasound probe, an ultrasound scanner, an interventional tool, a tool tracker and an image navigator. In operation, the ultrasound probe generates an acoustic image plane for scanning an anatomical region, and the ultrasound scanner generates an ultrasound image of the anatomical region from a scan of the anatomical region. During the scan, the interventional tool is navigated within the anatomical region relative to the acoustic image plane, and the tool tracker tracks a position of the interventional tool relative to the acoustic image plane. The image navigator displays a graphical icon within the ultrasound image for illustrating a tracking of the interventional tool relative to the acoustic image plane by the tool tracker. Additionally, a special attachment on the ultrasound probe with PZT-type sensors can aid this navigation by emitting acoustic waves off the central axis of the imaging plane.
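One plausible reading of the tool tracker's output is a signed out-of-plane distance of the tool from the acoustic image plane, which the image navigator could map to an icon style; this Python sketch and its icon_style convention are illustrative assumptions, not from the source:

```python
import numpy as np

def out_of_plane_distance(tip_xyz, plane_point, plane_normal):
    """Signed distance of the tool tip from the acoustic image plane;
    the sign indicates which side of the plane the tool is on."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    return float(np.dot(np.asarray(tip_xyz) - np.asarray(plane_point), n))

def icon_style(distance_mm, in_plane_tol_mm=1.0):
    """Choose a graphical-icon style from the tracked distance (illustrative
    convention; tolerance value is made up)."""
    if abs(distance_mm) <= in_plane_tol_mm:
        return "solid"            # tool effectively in plane
    return "dashed_front" if distance_mm > 0 else "dashed_behind"
```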
Abstract:
A medical device includes a conductive body having a surface and a sensor conformally formed on the surface and including a piezoelectric polymer formed about a portion of the surface and following a contour of the surface. The piezoelectric polymer is configured to generate or receive ultrasonic energy. Electrical connections conform to the surface and are connected to an electrode in contact with the piezoelectric polymer. The electrical connections provide connections to the piezoelectric polymer and are electrically isolated from the conductive body over a portion of the surface.
Abstract:
An adaptor device includes a first connector (106) configured to interface with an ultrasound probe and a second connector (108) configured to interface with an ultrasound console. An array of lines (120) connects the first connector to the second connector. A pulse generator or generators (110, 112) are configured to output trigger signals responsive to a signal on one or more of the array of lines. An external output (114, 116) is configured to output the trigger signals.
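If the monitored line carries a frame- or line-sync pulse, the pulse generator's behavior could be approximated in software as rising-edge detection on digitized samples; this is a speculative sketch, since the abstract does not describe the trigger logic:

```python
def rising_edge_triggers(samples, threshold=0.5):
    """Yield indices where the monitored line crosses the threshold upward;
    each index would correspond to one trigger pulse on the external output.
    The threshold value is illustrative."""
    armed = True
    for i, v in enumerate(samples):
        if armed and v >= threshold:
            yield i
            armed = False          # wait for the line to fall before re-arming
        elif v < threshold:
            armed = True
```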
Abstract:
A system for providing a remote center of motion for robotic control includes a marker device (104) configured to include one or more shapes (105) to indicate position and orientation of the marker device in an image collected by an imaging system (110). The marker device is configured to receive or partially receive an instrument (102) therein, the instrument being robotically guided. A registration module (117) is configured to register a coordinate system of the image with that of the robotically guided instrument using the marker device to define a position in a robot coordinate system (132) where a virtual remote center of motion (140) exists. Control software (136) is configured to control a motion of the robotically guided instrument wherein the virtual remote center of motion constrains the motion of a robot (130).
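The registration step could, for example, be a least-squares rigid fit (Kabsch algorithm) between the marker's known shape points and their detected image positions, after which the virtual remote center of motion (140) can be expressed in the robot coordinate system (132); the algorithm choice and the assumed image-to-robot transform are illustrative, not stated in the abstract:

```python
import numpy as np

def rigid_register(model_pts, image_pts):
    """Least-squares rigid transform (Kabsch) taking marker-model points to
    their detected image positions; returns rotation R and translation t."""
    P, Q = np.asarray(model_pts, float), np.asarray(image_pts, float)
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - p0).T @ (Q - q0))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, q0 - R @ p0

def rcm_in_robot_frame(rcm_image_xyz, R_img2robot, t_img2robot):
    """Map the virtual RCM defined in the image into the robot coordinate
    system (the image-to-robot transform is assumed known, e.g., from a
    prior calibration), where it constrains the robot's motion."""
    return R_img2robot @ np.asarray(rcm_image_xyz, float) + t_img2robot
```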
Abstract:
A system for highlighting an instrument in an image includes a probe (122) for transmitting and receiving ultrasonic energy to and from a volume and a marker device (120) configured to respond to a received ultrasonic signal and emit an ultrasonic signal after a delay. The ultrasonic signal includes one or more pulses configured to generate a marker, when rendered, of a given size at a position within an image. A medical instrument (102) is disposed in the volume and includes the marker device. A control module (124) is stored in memory and is configured to interpret the ultrasonic energy received from the probe and from the marker device to determine a three dimensional location of the medical instrument and to highlight the three dimensional location of the marker device with the marker in the image.
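Because the marker device replies after a known programmed delay, the control module can recover range by subtracting that delay from the measured round trip; a minimal sketch, assuming a nominal soft-tissue sound speed and a known unit receive-beam direction (beam_dir, hypothetical):

```python
import numpy as np

SPEED_OF_SOUND = 1.54  # mm per microsecond, nominal soft-tissue value

def marker_range_mm(round_trip_us, marker_delay_us):
    """Probe-to-marker range: the measured round trip includes the marker's
    programmed response delay, which is subtracted before halving."""
    return 0.5 * (round_trip_us - marker_delay_us) * SPEED_OF_SOUND

def marker_location(probe_origin, beam_dir, round_trip_us, marker_delay_us):
    """Three-dimensional marker location along the receive-beam direction."""
    r = marker_range_mm(round_trip_us, marker_delay_us)
    return np.asarray(probe_origin, float) + r * np.asarray(beam_dir, float)
```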
Abstract:
An interventional system employing an interventional tool (20) having a tracking point, and an imaging system (30) operable for generating at least one image of at least a portion of the interventional tool (20) relative to an anatomical region of a body. The system further employs a tracking system (40) operable for tracking any movements of the interventional tool (20) and the imaging system (30) within a spatial reference frame relative to the anatomical region of the body, wherein the tracking system (40) is calibrated to the interventional tool (20) and the imaging system (30). The system also employs a tracking quality monitor (52) operable for monitoring a tracking quality of the tracking system (40) as a function of a calibrated location error, computed for each image, between a calibrated tracking location of the tracking point within the spatial reference frame and an image coordinate location of the tracking point in the image.
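A minimal sketch of the per-image calibrated location error and a simple threshold-based quality check; the 2 mm threshold is illustrative, not from the source:

```python
import numpy as np

def calibrated_location_error(tracked_xyz, image_xyz):
    """Per-image error between the calibrated tracking location of the
    tracking point and its image-coordinate location."""
    return float(np.linalg.norm(np.asarray(tracked_xyz, float)
                                - np.asarray(image_xyz, float)))

def tracking_quality_ok(errors_mm, max_error_mm=2.0):
    """Simple monitor: tracking quality is acceptable while every recent
    per-image error stays under an illustrative threshold."""
    return all(e <= max_error_mm for e in errors_mm)
```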
Abstract:
A controller (220) for determining a shape of an interventional medical device in an interventional medical procedure based on a location of the interventional medical device includes a memory (221) that stores instructions and a processor (222) that executes the instructions. The instructions cause a system (200) that includes the controller (220) to implement a process that includes obtaining (S320) the location of the interventional medical device (201) and obtaining (S330) imagery of a volume that includes the interventional medical device. The process also includes applying (S340), based on the location of the interventional medical device (201), image processing to the imagery to identify the interventional medical device (201) including the shape of the interventional medical device (201). The process further includes segmenting (S350) the interventional medical device (201) to obtain a segmented representation of the interventional medical device (201). The segmented representation of the interventional medical device (201) is overlaid (S360) on the imagery.
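A minimal sketch of location-guided image processing as described: restrict processing to a region of interest around the reported device location, segment by a crude intensity threshold (an assumed stand-in for whatever image processing the patent contemplates), and overlay the result:

```python
import numpy as np

def segment_device(volume, location_ijk, roi_half=16, threshold=0.7):
    """Restrict processing to a ROI centered on the tracked location, then
    threshold to obtain a crude segmented representation (illustrative)."""
    roi = tuple(slice(max(c - roi_half, 0), c + roi_half)
                for c in location_ijk)
    mask = np.zeros(volume.shape, dtype=bool)
    mask[roi] = volume[roi] > threshold * volume[roi].max()
    return mask

def overlay(volume, mask, value=1.0):
    """Overlay the segmented representation on the imagery."""
    out = volume.copy()
    out[mask] = value
    return out
```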
Abstract:
A tool navigation system employing an ultrasound probe, an ultrasound scanner, an interventional tool, a tool tracker and an image navigator. In operation, the ultrasound probe generates an acoustic image plane for scanning an anatomical region, and the ultrasound scanner generates an ultrasound image of the anatomical region from a scan of the anatomical region. During the scan, the interventional tool is navigated within the anatomical region relative to the acoustic image plane, and the tool tracker tracks a position of the interventional tool relative to the acoustic image plane. The image navigator displays a graphical icon within the ultrasound image for illustrating a tracking of the interventional tool relative to the acoustic image plane.