TILEABLE STRUCTURED LIGHT PROJECTION FOR WIDE FIELD-OF-VIEW DEPTH SENSING

    Publication Number: US20180205937A1

    Publication Date: 2018-07-19

    Application Number: US15409314

    Application Date: 2017-01-18

    Applicant: Oculus VR, LLC

    Abstract: A depth camera assembly (DCA) includes a projector, a detector and a controller. The projector emits a tiled structured light (SL) pattern onto a local area. Each illumination source of the projector includes one or more light emitters and an augmented diffractive optical element (ADOE) designed with a pattern mask. The ADOE diffracts at least a portion of light beams emitted from the light emitters to form a first SL pattern projection having a field-of-view corresponding to a first tileable boundary. The pattern mask prevents projection of light that would otherwise be diffracted outside the first tileable boundary. The first SL pattern projection is combined with at least a second SL pattern projection into the tiled SL pattern illuminating objects in the local area. The detector captures images of the objects illuminated by the SL pattern. The controller determines depth information for the objects using the captured images.
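
    A minimal sketch, assuming standard structured-light triangulation rather than the patented implementation: abutting tile boundaries add up to a wide combined field of view, and a detected pattern feature's disparity between its expected projector position and its observed camera position maps to depth. The names (Tile, tiled_fov, depth_from_disparity) and the numbers are illustrative assumptions.

```python
# Illustrative sketch only: wide-FOV coverage from abutting SL pattern tiles,
# plus classic triangulation from pattern disparity. Not the patent's method.
from dataclasses import dataclass


@dataclass
class Tile:
    """One SL pattern projection and its tileable azimuth boundary (degrees)."""
    az_min: float
    az_max: float


def tiled_fov(tiles):
    """Total horizontal field of view covered by abutting, non-overlapping tiles."""
    return sum(t.az_max - t.az_min for t in tiles)


def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth in meters; depth falls off as 1/disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px


# Two 60-degree tiles form a 120-degree combined pattern; a pattern feature
# seen with 25 px disparity (f = 500 px, projector-detector baseline = 5 cm)
# triangulates to 1.0 m.
tiles = [Tile(-60.0, 0.0), Tile(0.0, 60.0)]
print(tiled_fov(tiles))                         # 120.0
print(depth_from_disparity(500.0, 0.05, 25.0))  # 1.0
```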

    Directed Display Architecture
    Invention Application

    Publication Number: US20170139211A1

    Publication Date: 2017-05-18

    Application Number: US14944495

    Application Date: 2015-11-18

    Applicant: Oculus VR, LLC

    Abstract: A head-mounted display (HMD) includes an electronic display element, a microlens array, and an optics block. The electronic display element outputs image light via sub-pixels of different colors, the sub-pixels separated from each other by a dark space region. The sub-pixels have associated emission distributions that describe the ranges of angles of light emitted from the sub-pixels. The microlens array includes microlenses that are each coupled to at least one corresponding sub-pixel; the microlenses concentrate the emission distributions and direct them toward a target region. The optics block, which is located in the target region, optically corrects the image light and directs the corrected image light from the microlens array to an exit pupil of the HMD corresponding to a location of an eye of a user of the HMD.
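
    A minimal sketch of the underlying geometric-optics idea, not the patented design: a microlens laterally offset from its sub-pixel steers the sub-pixel's chief ray toward a target region, with the offset following the thin-lens relation tan(theta) ≈ offset / focal length. The function names and numbers below are illustrative assumptions.

```python
# Illustrative sketch only: how far a microlens must be decentered from its
# sub-pixel to steer the chief ray toward a target region. Not from the patent.
import math


def target_angle_deg(subpixel_x_mm, target_x_mm, target_z_mm):
    """Angle from a sub-pixel at x on the panel to a target point z in front of it."""
    return math.degrees(math.atan2(target_x_mm - subpixel_x_mm, target_z_mm))


def steering_offset_um(focal_length_um, angle_deg):
    """Lateral lens-to-sub-pixel offset that bends the chief ray by angle_deg
    (thin-lens approximation: tan(theta) = offset / focal length)."""
    return focal_length_um * math.tan(math.radians(angle_deg))


# A sub-pixel 10 mm off-axis aiming at a target region centered 30 mm in front
# of the panel needs roughly an 18.4-degree steer toward the axis; with a
# 50 um focal-length microlens that is about a 16.7 um offset (the sign only
# encodes the direction of the shift).
angle = target_angle_deg(10.0, 0.0, 30.0)
print(round(angle, 1))                            # -18.4
print(round(steering_offset_um(50.0, angle), 1))  # -16.7
```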

    EYE TRACKING ARCHITECTURE FOR COMMON STRUCTURED LIGHT AND TIME-OF-FLIGHT FRAMEWORK

    Publication Number: US20180196509A1

    Publication Date: 2018-07-12

    Application Number: US15400256

    Application Date: 2017-01-06

    Applicant: Oculus VR, LLC

    CPC classification number: G06F3/013 G02B27/0172 G02B2027/0138 H04N5/33

    Abstract: A head-mounted display (HMD) includes an eye tracking system that determines a user's eye tracking information by combining structured light information and time-of-flight information. The eye tracking system includes an illumination source, an imaging device, and a controller. The illumination source modulates structured light with a carrier signal and illuminates the user's eye with the modulated structured light. The imaging device includes a detector that captures the modulated structured light. The detector comprises a plurality of pixel groups, each receiving a control signal that determines when that pixel group captures light; the control signals cause the pixel groups to capture light at different times relative to one another. The controller determines phases of the carrier signal based on the intensities of light received by the different pixel groups and generates depth information for surfaces of the user's eye, which is used to model and track the eye.
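
    The phase-recovery step resembles standard four-bucket indirect time-of-flight demodulation. The sketch below assumes that convention (four pixel groups sampling the carrier at 0, 90, 180, and 270 degrees) purely for illustration; it is not taken from the patent, and the function names and numbers are assumptions.

```python
# Illustrative sketch only: recover carrier phase from four pixel groups that
# integrate at quadrature offsets, then convert the round-trip phase delay to
# distance. Standard indirect-ToF math, not the patent's exact method.
import math

C = 299_792_458.0  # speed of light, m/s


def carrier_phase(i0, i90, i180, i270):
    """Carrier phase from the four pixel-group intensities,
    assuming i_k = A + B*cos(phase - k*90deg)."""
    return math.atan2(i90 - i270, i0 - i180) % (2 * math.pi)


def phase_to_distance(phase_rad, mod_freq_hz):
    """One-way distance implied by a round-trip phase delay at mod_freq_hz."""
    return C * phase_rad / (4 * math.pi * mod_freq_hz)


# Intensities consistent with a quarter-cycle delay give phase = pi/2; at a
# 100 MHz carrier that corresponds to about 0.375 m (the unambiguous range
# being c / (2 * f_mod) = 1.5 m).
phase = carrier_phase(i0=100.0, i90=150.0, i180=100.0, i270=50.0)
print(round(phase, 3))                            # 1.571
print(round(phase_to_distance(phase, 100e6), 3))  # 0.375
```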
