Abstract:
A system for providing a perspective for a virtual image includes an intraoperative imaging system (110) having a transducer (146) configured to generate an image data set for a region. A shape sensing enabled device (102) is configured to have at least a portion of the shape sensing enabled device positioned relative to the region. The shape sensing enabled device has a coordinate system registered with a coordinate system of the intraoperative imaging system. An image generation module (148) is configured to render a virtual image (152) of at least a portion of the region using the image data set, wherein the virtual image includes a vantage point relative to a position on the shape sensing enabled device.
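The rendering step above reduces to mapping a pose on the shape-sensed device through the registration into the imaging coordinate system. A minimal sketch, assuming the registration is available as a 4x4 homogeneous transform; the names T_device_to_image, tip_position and tip_direction are hypothetical:

```python
# Minimal sketch of placing a rendering vantage point at the tip of a
# shape-sensed device. All names are hypothetical; the abstract only
# states that the two coordinate systems are registered.
import numpy as np

def vantage_pose(T_device_to_image: np.ndarray,
                 tip_position: np.ndarray,
                 tip_direction: np.ndarray):
    """Map the device tip pose into the imaging coordinate system.

    T_device_to_image : 4x4 homogeneous registration transform.
    tip_position      : tip location in device coordinates (3,).
    tip_direction     : unit viewing direction in device coordinates (3,).
    """
    R, t = T_device_to_image[:3, :3], T_device_to_image[:3, 3]
    eye = R @ tip_position + t      # camera position in image space
    view = R @ tip_direction        # rotations map directions directly
    return eye, view / np.linalg.norm(view)

# Example: identity registration, tip at origin looking along +z.
eye, view = vantage_pose(np.eye(4), np.zeros(3), np.array([0.0, 0.0, 1.0]))
```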
Abstract:
A system may generally comprise a tracking device, an ultrasound device and a processing unit. A position and orientation of the ultrasound device may be traceable by the tracking device. The processing unit may be configured (i) to receive 3D information of a region of interest in relation to a marker, with both the region of interest and the marker being located within a body, (ii) to determine the position of the marker relative to the ultrasound device based on an ultrasound image of the body including the marker, and (iii) to determine the position and orientation of the ultrasound device relative to the tracking device. The system may further comprise a visualization device and the processing unit may further be configured to generate a visualization of the region of interest in relation to an outer surface of the body.
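The determination steps (i)-(iii) amount to composing a chain of rigid transforms so that the region of interest can be expressed in the tracker (and hence visualization) frame. A hedged sketch, assuming each pose is given as a 4x4 homogeneous matrix; all names are illustrative:

```python
# Hedged sketch of the transform chain the abstract implies: the region of
# interest is known relative to the marker, the marker is located in the
# ultrasound image, and the probe is tracked. Composing the three
# homogeneous transforms yields the ROI pose in the tracker frame.
import numpy as np

def roi_in_tracker_frame(T_tracker_probe: np.ndarray,
                         T_probe_marker: np.ndarray,
                         T_marker_roi: np.ndarray) -> np.ndarray:
    """Chain homogeneous transforms: tracker <- probe <- marker <- ROI."""
    return T_tracker_probe @ T_probe_marker @ T_marker_roi

def translation(v):
    T = np.eye(4)
    T[:3, 3] = v
    return T

# Toy usage: pure translations compose into a net offset.
T = roi_in_tracker_frame(translation([0, 0, 50]),
                         translation([10, 0, 0]),
                         translation([0, 5, 0]))
# T[:3, 3] -> [10, 5, 50]: the ROI sits 10/5/50 mm from the tracker origin.
```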
Abstract:
A mixed reality device (30) employing a mixed reality display (40) for visualizing a virtual augmentation of a physical anatomical model, and a mixed reality controller (50) for controlling a visualization by the mixed reality display (40) of the virtual augmentation of the physical anatomical model including a mixed reality interaction between the physical anatomical model and a virtual anatomical model. The mixed reality controller (50) may employ a mixed reality registration module (51) for controlling a spatial registration between the physical anatomical model within a physical space and the virtual anatomical model within a virtual space, and a mixed reality interaction module for controlling the mixed reality interaction between the physical anatomical model and the virtual anatomical model based on the spatial registration between the physical anatomical model within the physical space and the virtual anatomical model within the virtual space.
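The spatial registration between the physical and virtual anatomical models is not specified in the abstract; one common choice for paired landmarks is a least-squares rigid fit (Kabsch/Umeyama without scaling). A sketch under that assumption:

```python
# Least-squares rigid registration of paired landmark points, a standard
# technique assumed here for illustration (the abstract does not name one).
import numpy as np

def rigid_register(physical_pts, virtual_pts):
    """Return R, t such that R @ v + t best maps virtual onto physical."""
    P = np.asarray(physical_pts, float)
    V = np.asarray(virtual_pts, float)
    cp, cv = P.mean(0), V.mean(0)
    H = (V - cv).T @ (P - cp)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ cv
    return R, t

# Toy check: recover a known 90-degree rotation about z plus a translation.
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
V = np.random.default_rng(1).random((5, 3))
R, t = rigid_register(V @ Rz.T + [1., 2., 3.], V)   # R ~ Rz, t ~ [1, 2, 3]
```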
Abstract:
An optical shape sensing system includes an attachment device (130) coupled at an anatomical position relative to a bone. An optical shape sensing fiber (102) is coupled to the attachment device and configured to identify a position and orientation of the attachment device. An optical shape sensing module (115) is configured to receive feedback from the optical shape sensing fiber and register the position and orientation of the attachment device relative to an anatomical map.
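A sketch of how the attachment device's position and orientation might be read off the sensed fiber shape, assuming the shape sensing module exposes the fiber as a polyline of 3D samples; the helper name and finite-difference tangent are illustrative choices, not the cited method:

```python
# Minimal sketch: take the attachment device position from the distal
# fiber sample and its orientation from the local tangent there.
import numpy as np

def attachment_pose(fiber_points: np.ndarray):
    """Estimate position and pointing direction at the fiber's distal end.

    fiber_points : (N, 3) array of sensed positions along the fiber.
    """
    position = fiber_points[-1]
    tangent = fiber_points[-1] - fiber_points[-2]   # finite-difference tangent
    return position, tangent / np.linalg.norm(tangent)

pts = np.array([[0, 0, z] for z in np.linspace(0, 1, 10)], float)
pos, direction = attachment_pose(pts)   # -> tip at z=1, pointing along +z
```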
Abstract:
A transperineal prostate intervention device comprises a prostate intervention instrument (10), a transrectal ultrasound (TRUS) probe (12), and a mechanical or optical coordinate measurement machine (CMM) (20) attached to the TRUS probe and configured to track the prostate intervention instrument. The CMM may include an articulated arm with a plurality of encoding joints (24), an anchor end (30) attached to the TRUS probe, and a movable end (32) attached to the prostate intervention instrument. The prostate intervention instrument may, for example, be a biopsy needle, a brachytherapy seed delivery instrument, a tissue ablation instrument, or a hollow cannula. An electronic processor (40) computes a predicted trajectory (54) of the prostate intervention instrument in a frame of reference of the TRUS probe using the CMM attached to the TRUS probe. A representation (56) of the predicted trajectory is superimposed on a prostate ultrasound image (50) generated from ultrasound data collected by the TRUS probe.
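The predicted trajectory follows from forward kinematics over the encoded joints: composing one transform per joint from the anchor end at the TRUS probe to the movable end, then extrapolating the instrument axis. A sketch with illustrative planar link geometry (the real CMM's kinematic parameters are not given in the abstract):

```python
# Hedged sketch of how the CMM's joint encoders could yield a predicted
# needle trajectory in the TRUS probe frame. Link geometry is illustrative.
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def trans(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def predicted_trajectory(joint_angles, link_lengths, n_points=50):
    """Forward kinematics from probe anchor to instrument, then a straight
    extrapolation of the instrument axis (the predicted needle path)."""
    T = np.eye(4)                       # frame of the TRUS probe (anchor end)
    for theta, length in zip(joint_angles, link_lengths):
        T = T @ rot_z(theta) @ trans(length, 0.0, 0.0)
    tip, axis = T[:3, 3], T[:3, 0]      # x-axis taken as the needle axis
    s = np.linspace(0.0, 100.0, n_points)           # mm past the tip
    return tip + s[:, None] * axis      # (n_points, 3) trajectory samples

path = predicted_trajectory([0.3, -0.2, 0.5], [120.0, 100.0, 80.0])
```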
Abstract:
A steerable introduction system for deploying an interventional tool (e.g., a replacement valve) within an anatomical object (e.g., a heart). The steerable introduction system employs a steerable introducer (20) including an end-effector for positioning the interventional tool within the anatomical object (e.g., the end-effector passively guides or actively steers the interventional tool within the anatomical object). The steerable introduction system further employs an image guidance workstation (120) controlling an actuation of a translation, a pivoting and/or a rotation of the end-effector within the anatomical object responsive to surgical image data illustrative of a position of the end-effector within the anatomical object (e.g., ultrasound or X-ray image data illustrative of a surgical position of the end-effector within a heart). This actuation of the end-effector by the image guidance workstation (120) facilitates a coaxial alignment and/or a coplanar alignment of the interventional tool and a structure of the anatomical object (e.g., a diseased aortic valve of a heart).
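Coaxial alignment can be quantified from the image data as two scalar errors: the angle between the tool axis and the target axis, and the lateral offset of the tip from the target centerline. An illustrative sketch (not the workstation's actual control law):

```python
# Illustrative alignment check behind coaxial positioning: angular deviation
# between end-effector axis and valve axis, plus lateral offset of the tool
# tip from the valve centerline, both measured from image data.
import numpy as np

def alignment_error(tip, tool_axis, valve_center, valve_axis):
    tool_axis = tool_axis / np.linalg.norm(tool_axis)
    valve_axis = valve_axis / np.linalg.norm(valve_axis)
    # Angle between axes (coaxiality), in radians.
    angle = np.arccos(np.clip(tool_axis @ valve_axis, -1.0, 1.0))
    # Perpendicular distance of the tip from the valve centerline.
    d = tip - valve_center
    lateral = np.linalg.norm(d - (d @ valve_axis) * valve_axis)
    return angle, lateral

# The workstation would actuate translation/pivoting/rotation until both
# errors fall below clinical thresholds.
```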
Abstract:
A robotic punch system employs a robot unit and a control unit. The robot unit includes a robot and an endoscopic punch mounted to the robot. The endoscopic punch includes a calibrated spatial alignment of an endoscope and a punch. The control unit commands the robot to deploy the endoscopic punch in executing a puncture of an anatomical tissue.
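A minimal sketch of how the calibrated endoscope-punch alignment could be used, assuming it is available as a fixed 4x4 transform T_punch_endo from endoscope to punch coordinates (the name is hypothetical): a target localized in the endoscopic view is re-expressed in the punch frame for the robot command.

```python
# Hypothetical helper: apply the calibrated endoscope-to-punch transform
# to a tissue target localized in the endoscopic view.
import numpy as np

def target_in_punch_frame(T_punch_endo: np.ndarray,
                          target_endo: np.ndarray) -> np.ndarray:
    p = np.append(target_endo, 1.0)    # homogeneous coordinates
    return (T_punch_endo @ p)[:3]

T = np.eye(4)
T[:3, 3] = [0., 0., -20.]              # endoscope sits 20 mm behind punch
print(target_in_punch_frame(T, np.array([1., 2., 30.])))   # -> [ 1.  2. 10.]
```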
Abstract:
A system for providing a remote center of motion for robotic control includes a marker device (104) configured to include one or more shapes (105) to indicate position and orientation of the marker device in an image collected by an imaging system (110). The marker device is configured to receive or partially receive an instrument (102) therein, the instrument being robotically guided. A registration module (117) is configured to register a coordinate system of the image with that of the robotically guided instrument using the marker device to define a position in a robot coordinate system (132) where a virtual remote center of motion (140) exists. Control software (136) is configured to control a motion of the robotically guided instrument, wherein the virtual remote center of motion constrains the motion of a robot (130).
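Geometrically, a virtual remote center of motion constrains the instrument shaft to always pass through one fixed point, so a commanded tip position determines the shaft direction and insertion depth. A hedged sketch of that constraint (a hypothetical helper, not the cited control software (136)):

```python
# Sketch of the geometric constraint a virtual remote center of motion
# imposes: the instrument axis must pass through the fixed RCM point.
import numpy as np

def rcm_constrained_pose(rcm: np.ndarray, tip_target: np.ndarray):
    """Return the shaft direction (unit vector through the RCM) and the
    insertion depth needed to reach tip_target while honoring the RCM."""
    shaft = tip_target - rcm
    depth = np.linalg.norm(shaft)
    return shaft / depth, depth

axis, depth = rcm_constrained_pose(np.array([0., 0., 0.]),
                                   np.array([10., 0., 30.]))
```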
Abstract:
A system for evaluating patency includes a light sensor (128) positionable relative to a blood vessel to receive light from the blood vessel and convert the light into an image signal. A photo-plethysmography (PPG) interpretation module (115) is configured to receive the image signal and output pixel values in an image representing PPG information. An image generation module (148) is coupled to the PPG interpretation module to receive the pixel values and generate a PPG map to be output to a display for analysis.
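The abstract does not define how pixel values encode PPG information; one simple per-pixel measure is the pulsatile (AC) amplitude of each pixel's intensity over a stack of frames, normalized by its mean (DC) level. A sketch under that assumption:

```python
# Minimal sketch of turning a stack of frames from the light sensor into a
# PPG map. This is an assumption about the module's internals; the abstract
# only says pixel values encode PPG information.
import numpy as np

def ppg_map(frames: np.ndarray) -> np.ndarray:
    """frames : (T, H, W) intensity stack -> (H, W) pulsatility map."""
    ac = frames.std(axis=0)             # temporal variation per pixel
    dc = frames.mean(axis=0) + 1e-9     # mean level (avoid divide-by-zero)
    return ac / dc                      # perfusion-index-style ratio

rng = np.random.default_rng(0)
frames = rng.random((64, 32, 32))       # synthetic stand-in for sensor data
pmap = ppg_map(frames)                  # displayed for patency analysis
```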
Abstract:
An image registration system includes an endoscope (12) and an endoscope controller (22). In operation, the endoscope (12) generates an intra-operative endoscopic image (14) of a vessel tree (e.g., an arterial tree or a venous tree) within an anatomical region, and the endoscope controller (22) registers the intra-operative endoscopic image (14) of the vessel tree to a pre-operative three-dimensional image (44) of the vessel tree within the anatomical region. The image registration includes an image matching of a graphical representation of each furcation of the vessel tree within the intra-operative endoscopic image (14) of the vessel tree to a graphical representation of each furcation of the vessel tree within the pre-operative three-dimensional image (44) of the vessel tree.
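The furcation matching can be posed as a minimum-cost assignment between branch points detected in the endoscopic image and branch points projected from the pre-operative tree. An illustrative sketch using 2D distance only (the actual controller may also exploit branch topology):

```python
# Illustrative furcation-matching step: pair detected branch points with
# projected pre-operative branch points by minimum-cost assignment.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_furcations(endo_pts: np.ndarray, preop_pts: np.ndarray):
    """endo_pts, preop_pts : (N, 2) and (M, 2) furcation coordinates.
    Returns index pairs (i, j) matching endoscopic to pre-operative points."""
    cost = np.linalg.norm(endo_pts[:, None, :] - preop_pts[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols))

endo = np.array([[10., 12.], [40., 8.], [25., 30.]])
preop = np.array([[26., 29.], [11., 11.], [41., 9.]])
print(match_furcations(endo, preop))    # -> [(0, 1), (1, 2), (2, 0)]
```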