TIME SYNCHRONIZATION FOR SHARED EXTENDED REALITY EXPERIENCES

    Publication No.: US20250068374A1

    Publication Date: 2025-02-27

    Application No.: US18481804

    Filing Date: 2023-10-05

    Applicant: Snap Inc.

    Abstract: A first extended reality (XR) device and a second XR device are colocated in an environment. The first XR device captures sensory data of a wearer of the second XR device. The sensory data is used to determine a time offset between a first clock of the first XR device and a second clock of the second XR device. The first clock and the second clock are synchronized based on the time offset and a shared coordinate system is established. The shared coordinate system enables alignment of virtual content that is simultaneously presented by the first XR device and the second XR device based on the synchronization of the first clock and the second clock.
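The claimed flow (capture sensory data of the other wearer, derive a time offset between the two clocks, then map timestamps into a shared timebase) can be sketched in a few lines. This is a minimal illustration under assumed inputs, not the patented method; the function names and the simple averaging scheme are the author's assumptions.

```python
# Hypothetical sketch: estimate the clock offset between two colocated XR
# devices from paired timestamps of the same physical events (e.g., device A
# observes a gesture of device B's wearer, and device B logs the same events
# on its own clock). All names are illustrative.

def estimate_offset(events_a, events_b):
    """Average the per-event timestamp difference between device A's clock
    and device B's clock for the same observed events."""
    assert len(events_a) == len(events_b) and events_a
    diffs = [ta - tb for ta, tb in zip(events_a, events_b)]
    return sum(diffs) / len(diffs)

def to_shared_time(t_b, offset):
    # Map a device-B timestamp into device A's timebase, so content
    # presented by both devices can be aligned in time.
    return t_b + offset

# Device A saw three events at these local times (seconds); device B's
# clock runs 0.25 s behind in this toy example.
t_a = [10.01, 10.51, 11.02]
t_b = [9.76, 10.26, 10.77]
offset = estimate_offset(t_a, t_b)  # ~0.25 s
```

Averaging several paired observations damps per-event measurement noise; a production system would also have to model clock drift, not just a constant offset.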

    Energy-efficient adaptive 3D sensing

    Publication No.: US12001024B2

    Publication Date: 2024-06-04

    Application No.: US18299923

    Filing Date: 2023-04-13

    Applicant: Snap Inc.

    CPC classification number: G02B27/0172 G06F3/013 G06T19/006 G02B2027/0138

    Abstract: An energy-efficient adaptive 3D sensing system. The adaptive 3D sensing system includes one or more cameras and one or more projectors. The adaptive 3D sensing system captures images of a real-world scene using the one or more cameras and computes depth estimates and depth estimate confidence values for pixels of the images. The adaptive 3D sensing system computes an attention mask based on the depth estimate confidence values and commands the one or more projectors to send a distributed laser beam into one or more areas of the real-world scene based on the attention mask. The adaptive 3D sensing system captures 3D sensing image data of the one or more areas of the real-world scene and generates 3D sensing data for the real-world scene based on the 3D sensing image data.
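The energy saving comes from illuminating only where passive depth is unreliable. A minimal sketch of the attention-mask step, assuming a simple per-pixel confidence threshold (the threshold value and function names are assumptions, not from the patent):

```python
# Illustrative sketch: build a binary attention mask that marks pixels whose
# passive depth estimate is low-confidence, so the projector only sends the
# distributed laser beam into those areas of the scene.

def attention_mask(confidence, threshold=0.6):
    """Return 1 where active (projector-assisted) sensing is needed,
    0 where the passive depth estimate is already trustworthy."""
    return [[1 if c < threshold else 0 for c in row] for row in confidence]

# Toy 2x3 confidence map: the right side of the scene (low-texture, say)
# has low confidence and will request active illumination.
conf = [
    [0.9, 0.8, 0.3],
    [0.7, 0.2, 0.1],
]
mask = attention_mask(conf)  # 1s only at the low-confidence pixels
```

In practice the mask would be smoothed or dilated into contiguous regions before driving the projector, since a laser projector cannot address isolated pixels efficiently.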

    AUGMENTED REALITY GUIDED DEPTH ESTIMATION

    Publication No.: US20220375110A1

    Publication Date: 2022-11-24

    Application No.: US17529527

    Filing Date: 2021-11-18

    Applicant: Snap Inc.

    Abstract: A method for AR-guided depth estimation is described. The method includes identifying a virtual object rendered in a first frame that is generated based on a first pose of an augmented reality (AR) device, determining a second pose of the AR device, the second pose following the first pose, identifying an augmentation area in the second frame based on the virtual object rendered in the first frame and the second pose, determining depth information for the augmentation area in the second frame, and rendering the virtual object in the second frame based on the depth information.
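The key idea is that depth only needs to be recomputed where the virtual object will land in the next frame. A deliberately simplified, hypothetical sketch, approximating the pose change as a 2D pixel shift of the object's frame-1 bounding box (a real system would reproject in 3D; the margin and all names are assumptions):

```python
# Hypothetical sketch: bound the "augmentation area" in frame 2 from the
# virtual object's footprint in frame 1 plus the image motion induced by
# the AR device's new pose. Depth estimation then runs only inside this box.

def augmentation_area(bbox, pose_delta_px, margin=8):
    """bbox = (x0, y0, x1, y1) of the object rendered in frame 1;
    pose_delta_px = approximate (dx, dy) pixel shift caused by the second
    pose; margin pads the box to absorb estimation error."""
    x0, y0, x1, y1 = bbox
    dx, dy = pose_delta_px
    return (x0 + dx - margin, y0 + dy - margin,
            x1 + dx + margin, y1 + dy + margin)

# Object occupied (100,120)-(180,200) in frame 1; the new pose shifts it
# about 5 px right and 3 px up, so depth is computed only in this region.
area = augmentation_area((100, 120, 180, 200), (5, -3))
```

Restricting depth estimation to this area is what makes the method cheap relative to full-frame depth recovery.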

    AUGMENTED REALITY GUIDED DEPTH ESTIMATION

    Publication No.: US20250157064A1

    Publication Date: 2025-05-15

    Application No.: US19027036

    Filing Date: 2025-01-17

    Applicant: Snap Inc.

    Abstract: A method for AR-guided depth estimation is described. The method includes identifying a virtual object rendered in a first frame that is generated based on a first pose of an augmented reality (AR) device, determining a second pose of the AR device, the second pose following the first pose, identifying an augmentation area in the second frame based on the virtual object rendered in the first frame and the second pose, determining depth information for the augmentation area in the second frame, and rendering the virtual object in the second frame based on the depth information.

    Collaborative augmented reality eyewear with ego motion alignment

    Publication No.: US12299933B2

    Publication Date: 2025-05-13

    Application No.: US18442772

    Filing Date: 2024-02-15

    Applicant: Snap Inc.

    Abstract: Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
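The marker-free alignment reduces, in its simplest form, to relating the two devices' coordinate frames from one mutual observation: device A tracks device B's pose in A's map while B reports its own pose in B's map. A hypothetical 2D simplification (real ego-motion alignment works over full 6DOF trajectories; every name here is illustrative):

```python
import math

# Hypothetical 2D sketch: recover the rotation + translation relating
# device B's coordinate frame to device A's frame from one paired
# observation, with no shared visual marker.

def align_frames(p_b_in_a, heading_b_in_a, p_b_in_b, heading_b_in_b):
    """Return (theta, tx, ty) mapping B-frame points into A's frame.
    p_b_in_a / heading_b_in_a: B's position and heading as tracked by A.
    p_b_in_b / heading_b_in_b: B's position and heading in its own map."""
    theta = heading_b_in_a - heading_b_in_b
    c, s = math.cos(theta), math.sin(theta)
    # Rotate B's self-reported position into A's orientation, then the
    # translation is whatever closes the gap to A's observation of B.
    rx = c * p_b_in_b[0] - s * p_b_in_b[1]
    ry = s * p_b_in_b[0] + c * p_b_in_b[1]
    return theta, p_b_in_a[0] - rx, p_b_in_a[1] - ry

def b_to_a(point, transform):
    """Express a B-frame point (e.g., shared virtual content) in A's frame."""
    theta, tx, ty = transform
    c, s = math.cos(theta), math.sin(theta)
    return (c * point[0] - s * point[1] + tx,
            s * point[0] + c * point[1] + ty)

# A sees B at (2, 1) with the same heading; B believes it sits at its own
# origin. Shared content placed at B's (1, 0) then lands at A's (3, 1).
T = align_frames((2.0, 1.0), 0.0, (0.0, 0.0), 0.0)
```

Averaging this estimate over many poses along both 6DOF trajectories, optionally fused with inertial measurement unit data, is what makes the alignment robust without any common marker.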

    REVOLVING XR EYEWEAR DISPLAY

    Publication No.: US20240418999A1

    Publication Date: 2024-12-19

    Application No.: US18815484

    Filing Date: 2024-08-26

    Applicant: Snap Inc.

    Abstract: An Extended Reality (XR) display system includes a Light Emitting Diode (LED) display driver and an LED near-eye display element operatively coupled to the LED display driver. The LED near-eye display element includes one or more motors and an LED array operably connected to the one or more motors. During operation, the LED display driver receives video data including a rendered virtual object of an XR experience and generates LED array control signals based on the video data, the LED array control signals causing one or more LEDs of the LED array to be energized in a sequence. The LED display driver also generates synchronized motor control signals and simultaneously communicates the LED array control signals to the LED array and the synchronized motor control signals to the one or more motors, causing the LED near-eye display element to display the rendered virtual object.
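One plausible reading of the motor/LED synchronization is a persistence-of-vision scheme: the driver energizes one column of the frame buffer per motor angle, so a full 2D image is painted over each revolution. That interpretation, and all names below, are the author's assumptions for illustration:

```python
# Hedged sketch of motor-synchronized LED sequencing: map the revolving
# element's current angle to a frame-buffer column, and energize the LED
# array with that column's on/off pattern.

def column_for_angle(angle_deg, num_columns):
    """Map the motor angle (degrees, wraps at 360) to the frame-buffer
    column whose pattern should be energized at this instant."""
    return int(angle_deg % 360.0 / 360.0 * num_columns)

def led_pattern(frame, angle_deg):
    """Return the on/off state for each LED in the array at this angle.
    frame is a 2D buffer: one row per LED, one column per angular slot."""
    col = column_for_angle(angle_deg, len(frame[0]))
    return [row[col] for row in frame]

# Toy 2-LED, 4-column frame: at 90 degrees the driver energizes column 1.
frame = [
    [1, 0, 1, 0],
    [0, 1, 0, 1],
]
pattern = led_pattern(frame, 90.0)
```

The synchronization matters because the same LED pattern emitted at the wrong angle would smear the virtual object; driver and motor signals must be issued against a common timebase.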
