Augmented reality interventional system providing contextual overlays

    Publication Number: US12169881B2

    Publication Date: 2024-12-17

    Application Number: US18073621

    Application Date: 2022-12-02

    Abstract: An augmented reality interventional system provides contextual overlays (116) to assist or guide a user (101), or to enhance the user's performance of an interventional procedure carried out with an interactive medical device (102). The system includes a graphic processing module (110) configured to generate at least one contextual overlay on an augmented reality display device system (106). The contextual overlays may identify a component (104) or control of the interactive medical device. They may also identify steps of a procedure to be performed by the user and provide instructions for performing those steps. They may further identify a specific region of the environment, for example by indicating paths or protocols that reduce radiation exposure.
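
    Illustrative note: the sketch below is a minimal Python illustration of how a graphic processing module might represent the three kinds of contextual overlay described above (component labels, procedure steps, and regions of the environment). All names, types, and coordinates are hypothetical and not taken from the patent.

```python
# Hypothetical sketch: one record per contextual overlay, collected from
# three contextual sources before being handed to an AR renderer.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ContextualOverlay:
    kind: str                               # "component", "procedure_step", or "region"
    anchor_xyz: Tuple[float, float, float]  # position in the AR display's world frame
    text: str                               # label or instruction shown at the anchor

def build_overlays(device_components, procedure_steps, low_dose_regions):
    """Collect overlays for the AR display from three contextual sources."""
    overlays = []
    for name, pos in device_components.items():
        overlays.append(ContextualOverlay("component", pos, name))
    for i, (instruction, pos) in enumerate(procedure_steps, start=1):
        overlays.append(ContextualOverlay("procedure_step", pos, f"Step {i}: {instruction}"))
    for pos, note in low_dose_regions:
        overlays.append(ContextualOverlay("region", pos, note))
    return overlays

if __name__ == "__main__":
    demo = build_overlays(
        {"footswitch": (0.2, -0.8, 0.0)},
        [("Advance the device to the target", (0.0, 0.0, 0.5))],
        [((1.0, 0.0, 0.0), "Stand here to reduce radiation exposure")],
    )
    for o in demo:
        print(o.kind, o.text)
```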

    Guided anatomical visualization for endoscopic procedures

    Publication Number: US11937883B2

    Publication Date: 2024-03-26

    Application Number: US17115850

    Application Date: 2020-12-09

    CPC classification number: A61B34/20 A61B1/00006 A61B1/00045 A61B2034/2055

    Abstract: Various embodiments of the present disclosure encompass a visual endoscopic guidance device employing an endoscopic viewing controller (20) for controlling a display of an endoscopic view (11) of an anatomical structure, and a visual guidance controller (130) for controlling a display of one or more guided manipulation anchors (50-52) within the display of the endoscopic view (11) of the anatomical structure. A guided manipulation anchor (50-52) is representative of a location marking and/or a motion directive of a guided manipulation of the anatomical structure. The visual guidance controller (130) further controls a display of a hidden feature anchor (53) relative to the display of the endoscopic view (11) of the anatomical structure. The hidden feature anchor (53) is representative of a position (e.g., a location and/or an orientation) of a guided visualization of the hidden feature of the anatomical structure.
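
    Illustrative note: anchors like these are ultimately drawn at pixel locations in the endoscopic view. The sketch below shows one plausible way to project a 3-D anchor position into the view with a pinhole camera model; the intrinsics, coordinates, and function name are made-up example values, not the patent's implementation.

```python
# Hypothetical sketch: project a 3-D anchor (endoscope camera frame) to pixels.
import numpy as np

def project_anchor(anchor_xyz_cam, fx, fy, cx, cy):
    """Pinhole projection of a 3-D anchor (camera frame, metres) to pixel coordinates."""
    x, y, z = anchor_xyz_cam
    if z <= 0:
        return None  # anchor is behind the endoscope lens; treat as not displayable
    u = fx * x / z + cx
    v = fy * y / z + cy
    return u, v

# A hidden feature anchor can carry an orientation as well as a location;
# here only the location (4 cm in front of the lens) is projected.
anchor = np.array([0.01, -0.005, 0.04])
pixel = project_anchor(anchor, fx=800, fy=800, cx=320, cy=240)
print("draw anchor at pixel:", pixel)
```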

    Robotic instrument guide integration with an acoustic probe

    Publication Number: US11406459B2

    Publication Date: 2022-08-09

    Application Number: US16628719

    Application Date: 2018-07-05

    Abstract: A robotic acoustic probe for application with an interventional device (60). The robotic acoustic probe employs an acoustic probe (20) including an imaging platform (21) having a device insertion port (22) defining a device insertion port entry (23) and a device insertion port exit (24), and further including an acoustic transducer array (25) disposed relative to the device insertion port exit (24). The robotic acoustic probe further employs a robotic instrument guide (40) including a base (41) mounted to the imaging platform (21) relative to the device insertion port entry (23), and an end-effector (45) coupled to the base (41) and transitionable between a plurality of poses relative to a remote-center-of-motion (49). The end-effector (45) defines an interventional device axis (48) extending through the device insertion port (22), and the remote-center-of-motion (49) is located on the interventional device axis (48) adjacent the device insertion port exit (24).
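
    Illustrative note: the remote-center-of-motion constraint means the end-effector can change pose while one point on the interventional device axis stays fixed. The sketch below (hypothetical, not from the patent) checks that property numerically for a rotation about a chosen fixed point.

```python
# Hypothetical sketch: a pose change that rotates about a fixed remote center
# of motion (RCM) leaves the RCM point unchanged while sweeping the tool tip.
import numpy as np

def rotation_about_point(axis, angle_rad, center):
    """4x4 homogeneous transform rotating about `axis` through the point `center`."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    R = np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = center - R @ center   # keeps `center` fixed under the transform
    return T

rcm = np.array([0.0, 0.0, 0.10])          # example RCM near the insertion port exit
tool_tip = np.array([0.0, 0.0, 0.15, 1])  # a point further along the device axis
T = rotation_about_point([1, 0, 0], np.deg2rad(15), rcm)
print("RCM after motion:", (T @ np.append(rcm, 1))[:3])   # unchanged
print("tip after motion:", (T @ tool_tip)[:3])            # swept about the RCM
```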

    Spatial visualization of internal mammary artery during minimally invasive bypass surgery

    Publication Number: US10772684B2

    Publication Date: 2020-09-15

    Application Number: US15116929

    Application Date: 2015-01-28

    Abstract: A system for visualizing an anatomical target includes an imaging device having a field of view for imaging a portion of a region to be imaged. A three-dimensional model is generated from pre-operative or intra-operative images and includes images of an internal volume, within the region to be imaged, that is not visible in the images from the imaging device. An image processing module is configured to receive images from the imaging device such that field-of-view images of the imaging device are stitched together to generate a composite image of the region to be imaged. The composite image is registered to real-time images and the three-dimensional model. An internal view module is configured to generate for display an internal view of the internal volume at one or more positions along the composite image.
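
    Illustrative note: the sketch below is a toy version of the stitching step only, assuming each field-of-view frame's offset within the composite is already known (e.g., from tracking or registration). It is not the patented method; names and values are illustrative.

```python
# Hypothetical sketch: place narrow field-of-view frames at known offsets to
# build a composite (mosaic) image; later frames simply overwrite the overlap.
import numpy as np

def stitch(frames_with_offsets, composite_shape):
    """Place each frame at its (row, col) offset inside an empty composite."""
    composite = np.zeros(composite_shape, dtype=float)
    for frame, (r, c) in frames_with_offsets:
        h, w = frame.shape
        composite[r:r + h, c:c + w] = frame
    return composite

# Two overlapping 4x4 "field of view" tiles placed into a 4x7 composite.
a = np.full((4, 4), 1.0)
b = np.full((4, 4), 2.0)
mosaic = stitch([(a, (0, 0)), (b, (0, 3))], (4, 7))
print(mosaic)
```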

    Image integration and robotic endoscope control in X-ray suite

    Publication Number: US10702346B2

    Publication Date: 2020-07-07

    Application Number: US15324001

    Application Date: 2015-07-14

    Abstract: A workstation for calibrating a robotic instrument (42) having a distal tip (46) within an X-ray image space (35) of an X-ray modality (32). The workstation employs a calibration controller (50) for calibrating a remote-center-of-motion (RCM) length of the robotic instrument (42) responsive to X-ray images (36) of different poses of the distal tip (46) of the robotic instrument (42) within the X-ray image space (35), and further employs a robotic instrument controller (40) for controlling a guidance of the robotic instrument (42) within the X-ray image space (35) from the RCM calibration. The robotic instrument (42) may include an endoscope, whereby the calibration controller (50) is further employed to calibrate a focal length of the robotic instrument (42) responsive to the X-ray images (36) and one or more endoscope image(s) (48) of the X-ray image space (35) for controlling the guidance of the robotic instrument (42) within the X-ray image space (35) from the RCM/focal length calibrations.
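
    Illustrative note: one plausible way to estimate an RCM length from imaged distal-tip poses (not necessarily the patent's method) is to fit a sphere to the reconstructed tip positions, since a tip rotating about a fixed RCM stays at a constant distance from it. A least-squares sketch with synthetic data:

```python
# Hypothetical sketch: estimate RCM length as the radius of a sphere fitted
# to distal-tip positions observed in several poses.
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit; returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([2 * P, np.ones((len(P), 1))])   # |p|^2 = 2 p.c + (r^2 - |c|^2)
    b = np.sum(P ** 2, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius

# Synthetic tip positions at a fixed 120 mm from an RCM at the origin.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(8, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
tips = 120.0 * dirs + rng.normal(scale=0.1, size=(8, 3))  # 0.1 mm noise
center, rcm_length = fit_sphere(tips)
print("estimated RCM length (mm):", round(float(rcm_length), 2))
```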

    System, controller and method using virtual reality device for robotic surgery

    Publication Number: US10646289B2

    Publication Date: 2020-05-12

    Application Number: US16065928

    Application Date: 2016-12-27

    Abstract: A control unit is provided for a surgical robot system, including a robot configured to operate an end-effector in a surgical site of a patient. The control unit includes a processor configured to transmit acquired live images of a patient, received from an image acquisition device, to a virtual reality (VR) device for display; to receive input data from the VR device, including tracking data from a VR tracking system of the VR device based on a user's response to the live images displayed on a viewer of the display unit of the VR device; to process the input data received from the VR device to determine a target in the patient; to determine a path for the end-effector to reach the target based upon the live images and the processed input data; and to transmit control signals to cause the robot to guide the end-effector to the target via the determined path.
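
    Illustrative note: the sketch below is an assumed, simplified version of the control flow described above (select a target from VR tracking input, plan a path for the end-effector, issue motion commands). The function names, the ray-based target selection, and the straight-line planner are hypothetical, not the patented controller.

```python
# Hypothetical sketch: VR tracking input -> target point -> waypoint path.
import numpy as np

def target_from_vr_ray(ray_origin, ray_dir, target_depth):
    """Pick a target at a chosen depth along the user's VR pointer/gaze ray."""
    d = np.asarray(ray_dir, dtype=float)
    return np.asarray(ray_origin, dtype=float) + target_depth * d / np.linalg.norm(d)

def plan_path(start, target, step_mm=1.0):
    """Straight-line waypoints from the end-effector position to the target."""
    start, target = np.asarray(start, float), np.asarray(target, float)
    n = max(int(np.linalg.norm(target - start) / step_mm), 1)
    return [start + (target - start) * i / n for i in range(1, n + 1)]

target = target_from_vr_ray(ray_origin=[0, 0, 0], ray_dir=[0, 0, 1], target_depth=50.0)
for waypoint in plan_path([0.0, 0.0, 0.0], target, step_mm=10.0):
    print("command end-effector to", waypoint)  # stand-in for the control signals
```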
