-
Publication Number: US20230068660A1
Publication Date: 2023-03-02
Application Number: US17823015
Filing Date: 2022-08-29
Applicant: LabLightAR, Inc.
Inventor: Roger Brent, John Max Kellum, James Ashley
Abstract: A method of operating a procedural training user interface system involves displaying an interactive guided process of a first user, using at least one augmented reality (AR) layer, through an AR device worn by a second user, where a representation of the first user's hands is displayed. Second user interactions may be detected during the interactive guided process as the second user attempts to superimpose their hands on the representation of the first user's hands in the at least one AR layer. The interactive guided process of the second user may then be displayed using the AR layer through an AR device worn by the first user and the AR device worn by the second user. If the first user's hands and the second user's hands are not superimposed in the AR layer, the first user or the second user may be notified to take corrective action.
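A minimal illustrative sketch (not the patented implementation) of the kind of superimposition check the abstract describes, assuming hands are tracked as sets of 3D landmarks and using a hypothetical distance tolerance; all names and values below are assumptions for illustration.

# Illustrative sketch only: a hypothetical superimposition check over 3D hand
# landmarks, with an assumed tolerance in metres.
import numpy as np

def hands_superimposed(guide_landmarks, trainee_landmarks, tolerance_m=0.03):
    """Return True when the trainee's hand landmarks lie, on average, within
    tolerance_m of the displayed representation of the guide's hands."""
    guide = np.asarray(guide_landmarks, dtype=float)
    trainee = np.asarray(trainee_landmarks, dtype=float)
    distances = np.linalg.norm(guide - trainee, axis=1)
    return float(distances.mean()) <= tolerance_m

# Toy landmark pairs (x, y, z in metres) standing in for tracked fingertips.
guide = [[0.10, 0.20, 0.40], [0.12, 0.22, 0.41]]
trainee = [[0.11, 0.21, 0.40], [0.13, 0.23, 0.42]]

if not hands_superimposed(guide, trainee):
    print("Notify user: adjust hand position to match the guide overlay.")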
-
Publication Number: US11386303B2
Publication Date: 2022-07-12
Application Number: US17179208
Filing Date: 2021-02-18
Applicant: LabLightAR, Inc.
Inventor: Roger Brent, Jamie Douglas Tremaine, John Max Kellum
Abstract: A method of operating a procedural language and content generation system that involves correlating environment objects and object movement to input controls through operation of a correlator, operating an interpreter to evaluate the correlation of the input controls and object/object movement against known libraries to generate programmatic instructions, storing the programmatic instructions as an instruction set, transforming the instruction set into executable commands through a compiler, and configuring control logic to perform the executable commands in response to receiving detected environment objects and detected object movement from an image processor.
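As a rough illustration of the correlator, interpreter, and compiler stages named in the abstract, the sketch below chains these steps on toy data; the class, function, and library names are hypothetical assumptions, not the patented implementation.

# Hypothetical sketch of the correlator -> interpreter -> compiler chain;
# names, data shapes, and matching logic are assumptions.
from dataclasses import dataclass

@dataclass
class Observation:
    object_name: str  # detected environment object, e.g. "pipette"
    movement: str     # detected object movement, e.g. "lift"

def correlate(obs, input_controls):
    """Correlator: map an observed object/movement pair to an input control."""
    return input_controls.get((obs.object_name, obs.movement))

def interpret(control, known_library):
    """Interpreter: evaluate a control against a known library to produce a
    programmatic instruction."""
    return known_library.get(control)

def compile_commands(instruction_set):
    """Compiler: transform the stored instruction set into executable commands."""
    return [f"EXEC {instr}" for instr in instruction_set if instr is not None]

# Toy data standing in for image-processor output and configured libraries.
input_controls = {("pipette", "lift"): "aspirate"}
known_library = {"aspirate": "draw_liquid(volume_ul=100)"}

obs = Observation("pipette", "lift")
instruction_set = [interpret(correlate(obs, input_controls), known_library)]
commands = compile_commands(instruction_set)  # ['EXEC draw_liquid(volume_ul=100)']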
-
Publication Number: US11727238B2
Publication Date: 2023-08-15
Application Number: US17471036
Filing Date: 2021-09-09
Applicant: LabLightAR, Inc.
Inventor: Roger Brent, Jamie Douglas Tremaine
CPC classification number: G06K19/06046, G06T7/73, G06T19/006, G09G5/397
Abstract: An augmented reality system for procedural guidance identifies a fiducial marker object in a frame of a first field of view generated by a camera, determines a pose of the fiducial marker object, applies the fiducial marker pose to generate a first transformation between a first coordinate system of the fiducial marker object and a second coordinate system of the camera, and applies a pose of a headset to determine a second transformation between the first coordinate system and a third coordinate system of the headset.
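The two transformations described in the abstract can be pictured as composing homogeneous matrices: marker-to-camera (from the detected fiducial pose) followed by camera-to-headset (from the headset pose). The sketch below is a minimal illustration under that assumption; the pose values and function names are hypothetical.

# Minimal sketch, assuming 4x4 homogeneous matrices: composing
# marker->camera with camera->headset to obtain marker->headset.
import numpy as np

def pose_to_matrix(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# First transformation: fiducial marker coordinates -> camera coordinates,
# here a marker placed 0.5 m in front of the camera (hypothetical pose).
T_marker_to_camera = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 0.5]))

# Second transformation: camera coordinates -> headset coordinates,
# derived from the headset pose (hypothetical camera offset).
T_camera_to_headset = pose_to_matrix(np.eye(3), np.array([0.0, -0.05, 0.0]))

# Chaining the two lets marker-anchored AR content be rendered in headset space.
T_marker_to_headset = T_camera_to_headset @ T_marker_to_camera

marker_origin = np.array([0.0, 0.0, 0.0, 1.0])
point_in_headset = T_marker_to_headset @ marker_origin  # -> [0.0, -0.05, 0.5, 1.0]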
-
Publication Number: US20210406625A1
Publication Date: 2021-12-30
Application Number: US17471036
Filing Date: 2021-09-09
Applicant: LabLightAR, Inc.
Inventor: Roger Brent, Jamie Douglas Tremaine
Abstract: An augmented reality system for procedural guidance identifies a fiducial marker object in a frame of a first field of view generated by a camera, determines a pose of the fiducial marker object, applies the fiducial marker pose to generate a first transformation between a first coordinate system of the fiducial marker object and a second coordinate system of the camera, and applies a pose of a headset to determine a second transformation between the first coordinate system and a third coordinate system of the headset.
-
Publication Number: US11756272B2
Publication Date: 2023-09-12
Application Number: US17823015
Filing Date: 2022-08-29
Applicant: LabLightAR, Inc.
Inventor: Roger Brent, John Max Kellum, James Ashley
CPC classification number: G06T19/006, G06F3/011, G06F9/453, G06T19/20, G09B19/003, G06T2219/2012, G06T2219/2016
Abstract: A method of operating a procedural training user interface system involves displaying an interactive guided process of a first user, using at least one augmented reality (AR) layer, through an AR device worn by a second user, where a representation of the first user's hands is displayed. Second user interactions may be detected during the interactive guided process as the second user attempts to superimpose their hands on the representation of the first user's hands in the at least one AR layer. The interactive guided process of the second user may then be displayed using the AR layer through an AR device worn by the first user and the AR device worn by the second user. If the first user's hands and the second user's hands are not superimposed in the AR layer, the first user or the second user may be notified to take corrective action.