STAGE STUDIO FOR IMMERSIVE 3-D VIDEO CAPTURE

    Publication Number: US20230319255A1

    Publication Date: 2023-10-05

    Application Number: US18121486

    Application Date: 2023-03-14

    CPC classification number: H04N13/282 H04N23/51 H04N23/90

    Abstract: Embodiments are directed to providing a stage studio for immersive 3-D video capture. A two-dimensional video of a scene may be captured with frame cameras oriented towards a center of the scene. Paths may be scanned across objects in the scene with signal beams oriented towards the center of the scene. Events may be generated based on signal beams that are reflected by the objects and detected by event cameras oriented towards the center of the scene. Trajectories may be generated based on the paths and the events. A scene that includes a representation of the objects may be generated based on the captured two-dimensional video and the trajectories, such that a position and an orientation of the represented objects in the scene are based on the trajectories.
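    As a rough illustration of the event-to-trajectory step described above, the sketch below groups event-camera detections by scan path and derives a coarse position and orientation from triangulated 3-D points. It is only a minimal sketch under assumed data structures; the Event class, the time-gap segmentation, and the SVD-based pose estimate are hypothetical and not the patented method (triangulation of events into 3-D points is assumed to happen elsewhere).

```python
# Minimal sketch (hypothetical data structures, not the patented implementation):
# group event-camera detections into per-scan trajectories, then estimate a
# coarse object pose from triangulated 3-D points.
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    x: float        # pixel column of the detected beam reflection
    y: float        # pixel row of the detected beam reflection
    t: float        # timestamp in seconds
    scan_id: int    # which scan path produced the reflection

def group_into_trajectories(events, max_gap=1e-3):
    """Split each scan's events into trajectories wherever the
    inter-event time gap exceeds max_gap seconds."""
    by_scan = {}
    for ev in sorted(events, key=lambda e: (e.scan_id, e.t)):
        by_scan.setdefault(ev.scan_id, []).append(ev)
    trajectories = []
    for scan_events in by_scan.values():
        current = [scan_events[0]]
        for prev, ev in zip(scan_events, scan_events[1:]):
            if ev.t - prev.t > max_gap:
                trajectories.append(current)
                current = []
            current.append(ev)
        trajectories.append(current)
    return trajectories

def estimate_pose(points_3d):
    """Coarse position/orientation: centroid plus principal axes of the
    triangulated 3-D points belonging to one object (triangulation not shown)."""
    points_3d = np.asarray(points_3d, dtype=float)
    centroid = points_3d.mean(axis=0)
    _, _, vt = np.linalg.svd(points_3d - centroid)
    return centroid, vt  # rows of vt form an orthonormal orientation frame
```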

    Virtual fences in air, water, and space

    Publication Number: US12276730B2

    Publication Date: 2025-04-15

    Application Number: US18504052

    Application Date: 2023-11-07

    Abstract: Embodiments are directed to perceiving scene features using event sensors and image sensors. Enclosures may be mounted on structures arranged to establish a boundary, where each enclosure includes an event camera, a beam generator, or a frame camera. The beam generators may be employed to scan paths across objects in a scene that may be outside the boundary. Events may be determined based on detection of beam reflections corresponding to the objects. Trajectories associated with the objects may be determined based on the paths and the events. Objects that may be authorized may be determined based on trajectories associated with authorized objects. Objects in the scene that may be unauthorized may be determined based on trajectories associated with the unauthorized objects. A representation of the unauthorized objects may be determined such that a position and an orientation of the unauthorized objects in the scene may be based on the trajectories.
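    The authorized/unauthorized decision described in this abstract can be pictured with the minimal sketch below, which compares a trajectory's feature signature against a registry of authorized signatures. The signature() function, the distance threshold, and the registry format are all assumptions made for illustration, not the claimed implementation.

```python
# Minimal sketch (assumptions, not the patented method): flag objects whose
# trajectories do not match any registered "authorized" trajectory signature.
import numpy as np

def signature(trajectory_points):
    """Reduce an Nx3 trajectory to a simple feature vector:
    mean position plus mean step (velocity) direction."""
    points = np.asarray(trajectory_points, dtype=float)
    velocity = np.diff(points, axis=0).mean(axis=0)
    return np.concatenate([points.mean(axis=0), velocity])

def is_authorized(trajectory_points, authorized_signatures, threshold=0.5):
    """Treat an object as authorized when its trajectory signature is within
    `threshold` of at least one registered authorized signature."""
    sig = signature(trajectory_points)
    return any(np.linalg.norm(sig - ref) < threshold for ref in authorized_signatures)

# Usage: anything scanned outside the boundary that is not authorized gets reported.
# unauthorized = [t for t in trajectories if not is_authorized(t, registry)]
```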

    VIRTUAL FENCES IN AIR, WATER, AND SPACE

    Publication Number: US20240329248A1

    Publication Date: 2024-10-03

    Application Number: US18504052

    Application Date: 2023-11-07

    CPC classification number: G01S17/58 G01S7/4808 G01S7/484 G01S17/89 G06V20/44

    Abstract: Embodiments are directed to perceiving scene features using event sensors and image sensors. Enclosures may be mounted on structures arranged to establish a boundary, where each enclosure includes an event camera, a beam generator, or a frame camera. The beam generators may be employed to scan paths across objects in a scene that may be outside the boundary. Events may be determined based on detection of beam reflections corresponding to the objects. Trajectories associated with the objects may be determined based on the paths and the events. Objects that may be authorized may be determined based on trajectories associated with authorized objects. Objects in the scene that may be unauthorized may be determined based on trajectories associated with the unauthorized objects. A representation of the unauthorized objects may be determined such that a position and an orientation of the unauthorized objects in the scene may be based on the trajectories.

    Perceiving scene features using event sensors and image sensors

    Publication Number: US11974055B1

    Publication Date: 2024-04-30

    Application Number: US18488123

    Application Date: 2023-10-17

    CPC classification number: H04N25/47 G06T7/246

    Abstract: Embodiments are directed to perceiving scene features using event sensors and image sensors. Paths may be scanned across objects in a scene with one or more beams. Images of the scene may be captured with frame cameras. Events may be generated based on detection of beam reflections that correspond to the objects. Trajectories may be determined based on the paths and the events. A distribution of intensity associated with the trajectories may be determined based on the energy associated with the traces in the images. Centroids for the trajectories may be determined based on the distribution of the intensity of energy, a resolution of the frame cameras, or timestamps associated with the events. Enhanced trajectories may be generated based on the centroids such that the enhanced trajectories may be provided to a modeling engine that executes actions based on the enhanced trajectories and the objects.
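    The centroid refinement step lends itself to a short sketch: an intensity-weighted centroid of the beam trace observed by the frame camera is used to shift the event-based trajectory toward sub-pixel alignment. The function names and the simple mean-offset correction below are hypothetical assumptions, not the patented algorithm.

```python
# Minimal sketch (hypothetical, assuming the trace is a set of pixels with
# measured intensity): an intensity-weighted centroid refines the event-camera
# trajectory to sub-pixel precision in frame-camera coordinates.
import numpy as np

def intensity_centroid(pixel_coords, intensities):
    """Weighted centroid of a beam trace: sum(I_i * p_i) / sum(I_i)."""
    coords = np.asarray(pixel_coords, dtype=float)    # shape (N, 2)
    weights = np.asarray(intensities, dtype=float)    # shape (N,)
    return (coords * weights[:, None]).sum(axis=0) / weights.sum()

def enhance_trajectory(event_points, trace_pixels, trace_intensities):
    """Shift raw event-based points so their mean matches the
    intensity-weighted centroid observed by the frame camera."""
    events = np.asarray(event_points, dtype=float)
    offset = intensity_centroid(trace_pixels, trace_intensities) - events.mean(axis=0)
    return events + offset
```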

    FOLDED SINGLE SENSOR 3-D CAPTURE SYSTEM

    Publication Number: US20240040274A1

    Publication Date: 2024-02-01

    Application Number: US18225833

    Application Date: 2023-07-25

    CPC classification number: H04N25/47 H04N23/95 H04N23/55 H04N13/25 H04N23/555

    Abstract: Embodiments are directed to folded single sensor 3-D capture systems. An enclosure that includes an event camera, one or more beam generators, or an optical focusing system may be provided such that the beam generators may scan a scene. Paths may be scanned across objects in the scene with the beams. Events may be determined based on detection, by the event camera, of beam reflections that correspond to objects in the scene such that each beam reflection may be directed to a separate location on a sensor for the event camera by the optical focusing system. Trajectories may be determined based on the paths and the events such that each trajectory may be a parametric representation of a one-dimensional curve segment in a three-dimensional space. Three-dimensional information that corresponds to the objects may be generated based on the trajectories.
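    One way to picture a "parametric representation of a one-dimensional curve segment in a three-dimensional space" is a per-axis polynomial fit over the event timestamps, as in the sketch below. The polynomial form and degree are assumptions made for illustration; the patent does not specify this parameterization.

```python
# Minimal sketch (an assumed parameterization): fit each trajectory as a
# low-order polynomial curve p(t) = (x(t), y(t), z(t)) over the event
# timestamps, giving a compact 1-D curve segment in 3-D space.
import numpy as np

def fit_parametric_curve(timestamps, points_3d, degree=3):
    """Return per-axis polynomial coefficients so the trajectory can be
    evaluated at any time t inside the segment."""
    t = np.asarray(timestamps, dtype=float)
    pts = np.asarray(points_3d, dtype=float)          # shape (N, 3)
    return [np.polyfit(t, pts[:, axis], degree) for axis in range(3)]

def evaluate_curve(coeffs, t):
    """Evaluate the fitted curve at time(s) t -> array of shape (len(t), 3)."""
    t = np.atleast_1d(np.asarray(t, dtype=float))
    return np.stack([np.polyval(c, t) for c in coeffs], axis=-1)
```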

    Dynamic calibration of 3D acquisition systems

    Publication Number: US11704835B2

    Publication Date: 2023-07-18

    Application Number: US17876333

    Application Date: 2022-07-28

    CPC classification number: G06T7/80

    Abstract: Embodiments are directed to a sensing system that employs beams to scan paths across an object such that sensors may detect the beams reflected by the scanned object. Events may be provided based on the detected signals and the paths such that each event may be associated with a sensor and event metrics. Crossing points for each sensor may be determined based on where the paths intersect the scanned object such that events associated with each sensor are associated with the crossing points for each sensor. Each crossing point of each sensor may be compared to each correspondent crossing point of each other sensor. Actual crossing points may be determined based on the comparison and the crossing points for each sensor. Position information for each sensor may be determined based on the actual crossing points.
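    The calibration idea (compare correspondent crossing points across sensors, infer actual crossing points, then recover each sensor's position) can be sketched with a consensus average followed by a Kabsch-style rigid alignment, as below. The consensus-and-alignment formulation is an assumption for illustration, not necessarily the claimed procedure.

```python
# Minimal sketch (hypothetical formulation): treat the averaged, corresponded
# crossing points as the "actual" crossing points, then recover each sensor's
# position/orientation with a Kabsch-style rigid alignment to that consensus.
import numpy as np

def consensus_crossing_points(per_sensor_points):
    """Average correspondent crossing points across sensors (list of Nx3 arrays)."""
    return np.mean([np.asarray(p, dtype=float) for p in per_sensor_points], axis=0)

def align_sensor(sensor_points, actual_points):
    """Return rotation R and translation t mapping sensor-frame crossing points
    onto the actual crossing points (least-squares rigid fit)."""
    P = np.asarray(sensor_points, dtype=float)
    Q = np.asarray(actual_points, dtype=float)
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t  # t is the sensor origin expressed in the shared frame
```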

    Perceiving scene features using event sensors and image sensors

    Publication Number: US12262127B2

    Publication Date: 2025-03-25

    Application Number: US18618909

    Application Date: 2024-03-27

    Abstract: Embodiments are directed to perceiving scene features using event sensors and image sensors. Paths may be scanned across objects in a scene with one or more beams. Images of the scene may be captured with frame cameras. Events may be generated based on detection of beam reflections that correspond to the objects. Trajectories may be determined based on the paths and the events. A distribution of intensity associated with the trajectories may be determined based on the energy associated with the traces in the images. Centroids for the trajectories may be determined based on the distribution of the intensity of energy, a resolution of the frame cameras, or timestamps associated with the events. Enhanced trajectories may be generated based on the centroids such that the enhanced trajectories may be provided to a modeling engine that executes actions based on the enhanced trajectories and the objects.

    PERCEIVING SCENE FEATURES USING EVENT SENSORS AND IMAGE SENSORS

    Publication Number: US20250056133A1

    Publication Date: 2025-02-13

    Application Number: US18618909

    Application Date: 2024-03-27

    Abstract: Embodiments are directed to perceiving scene features using event sensors and image sensors. Paths may be scanned across objects in a scene with one or more beams. Images of the scene may be captured with frame cameras. Events may be generated based on detection of beam reflections that correspond to the objects. Trajectories may be determined based on the paths and the events. A distribution of intensity associated with the trajectories may be determined based on the energy associated with the traces in the images. Centroids for the trajectories may be determined based on the distribution of the intensity of energy, a resolution of the frame cameras, or timestamps associated with the events. Enhanced trajectories may be generated based on the centroids such that the enhanced trajectories may be provided to a modeling engine that executes actions based on the enhanced trajectories and the objects.

    Dynamic calibration of 3D acquisition systems

    Publication Number: US11887340B2

    Publication Date: 2024-01-30

    Application Number: US18222780

    Application Date: 2023-07-17

    CPC classification number: G06T7/80

    Abstract: Embodiments are directed to a sensing system that employs beams to scan paths across an object such that sensors may detect the beams reflected by the scanned object. Events may be provided based on the detected signals and the paths such that each event may be associated with a sensor and event metrics. Crossing points for each sensor may be determined based on where the paths intersect the scanned object such that events associated with each sensor are associated with the crossing points for each sensor. Each crossing point of each sensor may be compared to each correspondent crossing point of each other sensor. Actual crossing points may be determined based on the comparison and the crossing points for each sensor. Position information for each sensor may be determined based on the actual crossing points.
