Stage studio for immersive 3-D video capture

    Publication number: US11785200B1

    Publication date: 2023-10-10

    Application number: US18121486

    Application date: 2023-03-14

    CPC classification number: H04N13/282 H04N23/51 H04N23/90

    Abstract: Embodiments are directed to providing a stage studio for immersive 3-D video capture. A two-dimensional video of a scene may be captured with frame cameras oriented towards a center of the scene. Paths may be scanned across objects in the scene with signal beams oriented towards the center of the scene. Events may be generated based on signal beams that are reflected by the objects and detected by event cameras oriented towards the center of the scene. Trajectories may be generated based on the paths and the events. A scene that includes a representation of the objects may be generated based on the captured two-dimensional video and the trajectories, such that a position and an orientation of the represented objects in the scene are based on the trajectories.
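
    The abstract describes deriving each represented object's position and orientation in the generated scene from the scanned trajectories. As a rough illustration of that idea only (not the patented method), the sketch below estimates a pose for one object from triangulated trajectory samples; the array name and the centroid/principal-axes approach are assumptions.

    ```python
    # Illustrative sketch: object pose from 3-D trajectory samples.
    # `trajectory_points` is a hypothetical (N, 3) array of triangulated samples
    # along scanned trajectories belonging to a single object.
    import numpy as np

    def estimate_object_pose(trajectory_points: np.ndarray):
        position = trajectory_points.mean(axis=0)        # position = centroid of the samples
        centered = trajectory_points - position
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        rotation = vt.T                                   # columns are the principal axes
        if np.linalg.det(rotation) < 0:                   # keep a right-handed frame
            rotation[:, -1] *= -1
        return position, rotation
    ```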

    ASSOCIATION OF CONCURRENT TRACKS USING GRAPH CROSSINGS

    Publication number: US20230274523A1

    Publication date: 2023-08-31

    Application number: US18113283

    Application date: 2023-02-23

    CPC classification number: G06V10/421 G06T7/73 G06V2201/07

    Abstract: Embodiments are directed to the association of concurrent tracks using graph crossings. Signal beams may be employed to scan paths across an object such that sensors separately detect signals from the signal beams reflected by the object. Crossing points may be determined based on a plurality of trajectories that intersect each other during the scan of the object. Graphs may be generated based on the trajectories and the crossing points such that each edge in a graph corresponds to a crossing point and each node corresponds to a trajectory. The graphs may be compared to determine one or more matched graphs that share a common topology. Common trajectories may be determined based on the matched graphs such that each common trajectory is associated with a same path across the object and a separate sensor.
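
    The graph construction described above (trajectories as nodes, crossing points as edges) can be pictured with a small, hedged sketch: build one crossing graph per sensor and test the two for a shared topology with an off-the-shelf isomorphism check. The use of networkx and the trajectory identifiers are assumptions for illustration, not the patented algorithm.

    ```python
    # Illustrative sketch: per-sensor crossing graphs and a topology match.
    import networkx as nx

    def crossing_graph(crossings):
        """crossings: iterable of (trajectory_id_a, trajectory_id_b) pairs that intersect."""
        graph = nx.Graph()
        for a, b in crossings:
            graph.add_edge(a, b)          # edge = crossing point, node = trajectory
        return graph

    g1 = crossing_graph([(0, 1), (1, 2), (2, 0)])    # crossings observed by sensor 1
    g2 = crossing_graph([(7, 8), (8, 9), (9, 7)])    # crossings observed by sensor 2
    matcher = nx.algorithms.isomorphism.GraphMatcher(g1, g2)
    if matcher.is_isomorphic():
        # matcher.mapping pairs each sensor-1 trajectory with the sensor-2 trajectory
        # that traces the same path across the object (the "common trajectories").
        print(matcher.mapping)
    ```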

    Automatic parameter adjustment for scanning event cameras

    Publication number: US12148185B2

    Publication date: 2024-11-19

    Application number: US17865794

    Application date: 2022-07-15

    Abstract: Embodiments are directed to parameter adjustment for sensors. A calibration model and a calibration profile for a sensor may be provided. Calibration parameters associated with the sensor may be determined based on the calibration profile. The sensor may be configured to use a value of the calibration parameter based on the calibration profile. Trajectories may be generated based on a stream of events from the sensor. Metrics associated with the sensor events or the trajectories may be determined. If a metric value is outside of a control range, further actions may be iteratively performed, including: modifying the value of the calibration parameter based on the calibration model; configuring the sensor to use the modified value of the calibration parameter; and redetermining the metrics based on additional trajectories. If the metric is within the control range, the iteration may be terminated and the calibration profile may be updated.
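
    The iterative loop in the abstract (measure a metric, adjust the parameter, reconfigure the sensor, re-measure, stop when the metric re-enters the control range) can be sketched as a simple feedback loop. The linear metric model and proportional update rule below are stand-in assumptions, not the patented calibration model.

    ```python
    # Illustrative closed-loop parameter adjustment with a simulated metric.
    def simulated_metric(bias: float) -> float:
        # Stand-in for a real measurement such as events per trajectory.
        return 1000.0 - 40.0 * bias

    def adjust_parameter(bias: float, low: float, high: float, max_iters: int = 50) -> float:
        for _ in range(max_iters):
            metric = simulated_metric(bias)          # re-measure after each reconfiguration
            if low <= metric <= high:                # metric is back inside the control range
                return bias                          # persist this value in the profile
            target = (low + high) / 2.0
            bias += 0.01 * (metric - target)         # proportional step toward the range
        raise RuntimeError("parameter did not converge within the allowed iterations")

    calibrated_bias = adjust_parameter(bias=5.0, low=590.0, high=610.0)
    ```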

    CALIBRATION OF SENSOR POSITION OFFSETS BASED ON ROTATION AND TRANSLATION VECTORS FOR MATCHED TRAJECTORIES

    Publication number: US20230003549A1

    Publication date: 2023-01-05

    Application number: US17856690

    Application date: 2022-07-01

    Abstract: Embodiments are directed to calibrating multi-view triangulation systems that perceive surfaces and objects based on reflections of one or more scanned laser beams that are continuously sensed by two or more sensors. In addition to sampling and triangulating points from a spline formed by an unbroken line trajectory of a laser beam, the calibration system samples and triangulates a corresponding velocity vector. Iterative reduction is performed on velocity vectors instead of points or splines. The velocity vector includes directions and magnitudes along a trajectory of a scanning laser beam, which are used to determine the actual velocities. Translation and rotation vectors are derived from the velocity vectors of matching trajectories determined for two or more sensors having offset physical positions; these vectors are used to calibrate sensor offset errors associated with the matching trajectories provided to a modeling engine.
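
    One way to picture how a rotation could follow from matched velocity vectors (a sketch under assumptions, not the patented procedure) is a standard SVD-based alignment of corresponding vectors from the two sensors:

    ```python
    # Illustrative Kabsch-style rotation estimate from matched velocity vectors.
    import numpy as np

    def rotation_from_velocities(v_a: np.ndarray, v_b: np.ndarray) -> np.ndarray:
        """v_a, v_b: (N, 3) corresponding velocity vectors from sensor A and sensor B."""
        h = v_a.T @ v_b                                # 3x3 cross-covariance of the vectors
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))         # guard against a reflection solution
        return vt.T @ np.diag([1.0, 1.0, d]) @ u.T     # rotation taking A-frame vectors to B
    ```

    A translation offset could then be estimated from corresponding triangulated points once this rotation has been removed.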

    CALIBRATION OF SCANNING 3-D PERCEPTION SYSTEMS

    Publication number: US20250027765A1

    Publication date: 2025-01-23

    Application number: US18671625

    Application date: 2024-05-22

    Abstract: Embodiments are directed to calibrating a system. A plurality of paths may be scanned across a first object and a second object, where the first object is between the second object and the scanner. A first portion of trajectories that correspond to a first portion of the paths that trace across the first object may be determined. A second portion of trajectories that correspond to a second portion of the paths that trace across the second object may be determined. First endpoints of the first portion of the trajectories may be determined and second endpoints of the second portion of the trajectories may be determined. Rays may be determined based on the second endpoints and the first endpoints. An origin point of the beam generator may be determined based on an intersection of the rays. Accordingly, the system may be calibrated based on the origin point of the beam generator.
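
    The "intersection of the rays" step lends itself to a short worked sketch: the least-squares point closest to a bundle of nearly concurrent rays. The function below is an illustrative assumption about how such an origin could be recovered; the abstract does not state which solver is used.

    ```python
    # Illustrative least-squares intersection of rays.
    import numpy as np

    def nearest_point_to_rays(origins: np.ndarray, directions: np.ndarray) -> np.ndarray:
        """origins, directions: (N, 3); ray i passes through origins[i] along directions[i]."""
        a = np.zeros((3, 3))
        b = np.zeros(3)
        unit = directions / np.linalg.norm(directions, axis=1, keepdims=True)
        for p, d in zip(origins, unit):
            m = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray direction
            a += m
            b += m @ p
        return np.linalg.solve(a, b)         # point minimizing summed squared distance to the rays
    ```

    Rays built through a first endpoint (on the near object) and its corresponding second endpoint (on the far object) converge, approximately, at the beam generator's origin.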

    ASSOCIATION OF CONCURRENT TRACKS ACROSS MULTIPLE VIEWS

    Publication number: US20230169683A1

    Publication date: 2023-06-01

    Application number: US17991610

    Application date: 2022-11-21

    CPC classification number: G06T7/74 G01S17/89 G06T2207/30241

    Abstract: Embodiments are directed to the association of concurrent tracks across multiple views for sensing objects. A sensing system that employs signal beams to scan paths across an object may be provided such that two or more sensors separately detect signals reflected by the object. Events may be determined based on the detected signals such that the events include pairs of events that correspond to pairs of sensors. Essential matrices may be generated based on the positions of each pair of sensors. The pairs of events associated with the pairs of sensors may be compared based on the essential matrices of the sensors. Scores for the pairs of events may be provided based on the comparison. If a score for a pair of events is less than a threshold value, each event in the pair of events may be associated with a same location on the object.
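
    The comparison against the essential matrices amounts to an epipolar-constraint check. The sketch below (illustrative assumptions only, using normalized homogeneous coordinates) shows how such a score could be computed for one pair of events:

    ```python
    # Illustrative epipolar score for a pair of events from two sensors.
    import numpy as np

    def essential_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
        """E = [t]_x R for the relative pose between a pair of sensors."""
        tx, ty, tz = translation
        t_cross = np.array([[0.0, -tz,  ty],
                            [ tz, 0.0, -tx],
                            [-ty,  tx, 0.0]])
        return t_cross @ rotation

    def epipolar_score(x1: np.ndarray, x2: np.ndarray, e: np.ndarray) -> float:
        """x1, x2: normalized homogeneous event coordinates [u, v, 1] in each sensor."""
        return float(abs(x2 @ e @ x1))       # near zero when both events see the same point
    ```

    A pair of events whose score falls below the threshold would be associated with the same location on the object, mirroring the comparison in the abstract.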

    Calibration of sensor position offsets based on rotation and translation vectors for matched trajectories

    Publication number: US12111180B2

    Publication date: 2024-10-08

    Application number: US17856690

    Application date: 2022-07-01

    CPC classification number: G01C25/00 G01C3/02

    Abstract: Embodiments are directed to calibrating multi-view triangulation systems that perceive surfaces and objects based on reflections of one or more scanned laser beams that are continuously sensed by two or more sensors. In addition to sampling and triangulating points from a spline formed by an unbroken line trajectory of a laser beam, the calibration system samples and triangulates a corresponding velocity vector. Iterative reduction is performed on velocity vectors instead of points or splines. The velocity vector includes directions and magnitudes along a trajectory of a scanning laser beam, which are used to determine the actual velocities. Translation and rotation vectors are derived from the velocity vectors of matching trajectories determined for two or more sensors having offset physical positions; these vectors are used to calibrate sensor offset errors associated with the matching trajectories provided to a modeling engine.
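
    As a companion illustration (assumed data layout, not the patented pipeline), the velocity vectors that drive this calibration can be pictured as finite differences over timestamped, triangulated samples along a single trajectory:

    ```python
    # Illustrative per-trajectory velocity vectors by finite differences.
    import numpy as np

    def trajectory_velocities(points: np.ndarray, timestamps: np.ndarray) -> np.ndarray:
        """points: (N, 3) triangulated samples along one trajectory; timestamps: (N,) seconds."""
        deltas = np.diff(points, axis=0)       # displacement between consecutive samples
        dts = np.diff(timestamps)[:, None]     # elapsed time between consecutive samples
        return deltas / dts                    # velocity vectors (direction and magnitude)
    ```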

    STAGE STUDIO FOR IMMERSIVE 3-D VIDEO CAPTURE

    Publication number: US20230319255A1

    Publication date: 2023-10-05

    Application number: US18121486

    Application date: 2023-03-14

    CPC classification number: H04N13/282 H04N23/51 H04N23/90

    Abstract: Embodiments are directed to providing a stage studio for immersive 3-D video capture. A two-dimensional video of a scene may be captured with frame cameras oriented towards a center of the scene. Paths may be scanned across objects in the scene with signal beams oriented towards the center of the scene. Events may be generated based on signal beams that are reflected by the objects and detected by event cameras oriented towards the center of the scene. Trajectories may be generated based on the paths and the events. A scene that includes a representation of the objects may be generated based on the captured two-dimensional video and the trajectories, such that a position and an orientation of the represented objects in the scene are based on the trajectories.
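
    The step of generating trajectories from the event stream can be pictured with a small, assumption-laden sketch: break the time-ordered events from one event camera into segments wherever the gap between consecutive timestamps exceeds a threshold, so that each segment follows one continuous sweep of a signal beam across an object. The threshold and tuple layout are illustrative, not taken from the patent.

    ```python
    # Illustrative segmentation of an event stream into trajectories.
    def events_to_trajectories(events, max_gap=1e-4):
        """events: list of (timestamp, x, y) tuples sorted by timestamp."""
        trajectories, current = [], []
        for event in events:
            if current and event[0] - current[-1][0] > max_gap:
                trajectories.append(current)   # time gap: the beam left the object or the view
                current = []
            current.append(event)
        if current:
            trajectories.append(current)
        return trajectories
    ```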

    AUTOMATIC PARAMETER ADJUSTMENT FOR SCANNING EVENT CAMERAS

    Publication number: US20230015889A1

    Publication date: 2023-01-19

    Application number: US17865794

    Application date: 2022-07-15

    Abstract: Embodiments are directed to parameter adjustment for sensors. A calibration model and a calibration profile for a sensor may be provided. Calibration parameters associated with the sensor may be determined based on the calibration profile. The sensor may be configured to use a value of the calibration parameter based on the calibration profile. Trajectories may be generated based on a stream of events from the sensor. Metrics associated with the sensor events or the trajectories may be determined. If a metric value is outside of a control range, further actions may be iteratively performed, including: modifying the value of the calibration parameter based on the calibration model; configuring the sensor to use the modified value of the calibration parameter; and redetermining the metrics based on additional trajectories. If the metric is within the control range, the iteration may be terminated and the calibration profile may be updated.
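
    The metrics checked against the control range are not spelled out in the abstract; as an illustration only, simple statistics such as the overall event rate and the mean number of events per trajectory could play that role:

    ```python
    # Illustrative metrics over a list of trajectories (lists of (timestamp, x, y) events).
    def stream_metrics(trajectories):
        counts = [len(t) for t in trajectories if t]
        if not counts:
            return {"event_rate": 0.0, "events_per_trajectory": 0.0}
        first = min(t[0][0] for t in trajectories if t)
        last = max(t[-1][0] for t in trajectories if t)
        duration = max(last - first, 1e-9)
        return {
            "event_rate": sum(counts) / duration,              # events per second overall
            "events_per_trajectory": sum(counts) / len(counts),
        }
    ```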
