-
Publication No.: US20240402821A1
Publication Date: 2024-12-05
Application No.: US18375280
Application Date: 2023-09-29
Applicant: Apple Inc.
Inventor: David J. Meyer , Julian K. Shutzberg , David M. Teitelbaum , Daniel J. Brewer , Bharat C. Dandu , Christopher D. McKenzie
IPC: G06F3/01
Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
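The abstract describes behavior rather than an implementation, but the direct-versus-indirect distinction it draws can be illustrated with a short Swift sketch. All type and function names below (InteractionMode, UIElement3D, UserActivity, classify) are hypothetical, not the patent's API: a fingertip entering an element's 3D bounds is treated as a direct interaction, while a pinch made while gaze targets the element is treated as an indirect one.

```swift
/// Hypothetical sketch only: classifying user activity as a direct or
/// indirect interaction with a UI element placed in a 3D (XR) space.
enum InteractionMode { case direct, indirect, none }

struct UIElement3D {
    var center: SIMD3<Float>
    var radius: Float                       // simplified spherical bounds

    func contains(_ point: SIMD3<Float>) -> Bool {
        distance(point, center) <= radius
    }

    /// True if a ray from `origin` along a normalized `direction` passes through the bounds.
    func isHit(rayOrigin origin: SIMD3<Float>, rayDirection direction: SIMD3<Float>) -> Bool {
        let t = dot(center - origin, direction)
        guard t > 0 else { return false }   // element is behind the ray origin
        return distance(origin + direction * t, center) <= radius
    }
}

struct UserActivity {
    var indexFingertip: SIMD3<Float>
    var isPinching: Bool
    var gazeOrigin: SIMD3<Float>
    var gazeDirection: SIMD3<Float>         // assumed normalized
}

func classify(_ activity: UserActivity, against element: UIElement3D) -> InteractionMode {
    if element.contains(activity.indexFingertip) { return .direct }
    if activity.isPinching,
       element.isHit(rayOrigin: activity.gazeOrigin, rayDirection: activity.gazeDirection) {
        return .indirect
    }
    return .none
}

private func dot(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float { (a * b).sum() }
private func distance(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float { dot(a - b, a - b).squareRoot() }
```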
-
Publication No.: US20250110569A1
Publication Date: 2025-04-03
Application No.: US18886891
Application Date: 2024-09-16
Applicant: Apple Inc.
Inventor: Mark A. Ebbole , Bharat C. Dandu , Daniel J. Brewer
Abstract: While a view of an environment is visible via a display generation component, a computer system displays a user interface including one or more user interface objects. The computer system detects, via one or more input devices, one or more inputs. If the one or more inputs include a first input performed using a first input manipulator and a second input performed using a second input manipulator distinct from the first input manipulator, where the first input and the second input meet concurrency criteria, the computer system provides a first input event for the first input to a first application and a second input event for the second input to the first application. The first input event includes information identifying a target location, and the second input event also includes information identifying the target location.
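As a rough illustration only (the RawInput, InputEvent, and pairEvents names are assumptions, not taken from the patent), the pairing described above might look like the following: two inputs from distinct manipulators that meet a simple concurrency criterion are turned into two input events for the same application, both carrying the same target location.

```swift
enum InputManipulator { case leftHand, rightHand, gaze, connectedDevice }

struct RawInput {
    let manipulator: InputManipulator
    let timestamp: Double              // seconds
    let location: SIMD3<Float>?        // not every input carries its own location
}

struct InputEvent {
    let manipulator: InputManipulator
    let targetLocation: SIMD3<Float>
}

/// Pairs two inputs into two input events for the same application when they come
/// from distinct manipulators and meet an assumed concurrency criterion (arrival
/// within `window` seconds). Both events identify the same target location.
func pairEvents(_ first: RawInput, _ second: RawInput,
                window: Double = 0.1) -> (InputEvent, InputEvent)? {
    guard first.manipulator != second.manipulator,
          abs(first.timestamp - second.timestamp) <= window,
          let target = first.location ?? second.location else { return nil }
    return (InputEvent(manipulator: first.manipulator, targetLocation: target),
            InputEvent(manipulator: second.manipulator, targetLocation: target))
}
```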
-
Publication No.: US20240402800A1
Publication Date: 2024-12-05
Application No.: US18676786
Application Date: 2024-05-29
Applicant: Apple Inc.
Inventor: Julian K. Shutzberg , David J. Meyer , David M. Teitelbaum , Mehmet N. Agaoglu , Ian R. Fasel , Chase B. Lortie , Daniel J. Brewer , Tim H. Cornelissen , Leah M. Gum , Alexander G. Berardino , Lorenzo Soto Doblado , Vinay Chawda , Itay Bar Yosef , Dror Irony , Eslam A. Mostafa , Guy Engelhard , Paul A. Lacey , Ashwin Kumar Asoka Kumar Shenoi , Bhavin Vinodkumar Nayak , Liuhao Ge , Lucas Soffer , Victor Belyaev , Bharat C. Dandu , Matthias M. Schroeder , Yirong Tang
IPC: G06F3/01 , G06F3/04815
Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
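This abstract is shared with the related application US20240402821A1 above. To complement the direct/indirect classification sketch given there, the following hypothetical Swift sketch (IndirectDragInterpreter and all of its parameter names are assumptions) illustrates one way an indirect interaction could be carried through time: a pinch made while gaze targets an element engages that element, and subsequent hand motion manipulates it even after gaze moves away.

```swift
/// Hypothetical indirect-interaction interpreter: gaze selects, pinch engages,
/// and hand motion drives the engaged element until the pinch is released.
struct IndirectDragInterpreter {
    private var engagedElementID: Int?
    private var lastHandPosition: SIMD3<Float>?

    /// - Parameters:
    ///   - gazedElementID: element currently targeted by gaze, if any
    ///   - isPinching: whether the tracked hand is pinching this frame
    ///   - handPosition: current 3D position of the pinch centroid
    /// - Returns: the engaged element and the translation to apply, if dragging.
    mutating func update(gazedElementID: Int?,
                         isPinching: Bool,
                         handPosition: SIMD3<Float>) -> (elementID: Int, delta: SIMD3<Float>)? {
        guard isPinching else {           // releasing the pinch ends the interaction
            engagedElementID = nil
            lastHandPosition = nil
            return nil
        }
        if engagedElementID == nil {      // pinch start: engage whatever gaze targets
            engagedElementID = gazedElementID
            lastHandPosition = handPosition
            return nil
        }
        guard let id = engagedElementID, let last = lastHandPosition else { return nil }
        lastHandPosition = handPosition
        return (id, handPosition - last)  // hand motion drives the engaged element
    }
}
```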
-
Publication No.: US20240402825A1
Publication Date: 2024-12-05
Application No.: US18675723
Application Date: 2024-05-28
Applicant: Apple Inc.
Inventor: Daniel J. Brewer , Bharat C. Dandu , David J. Meyer , Julian K. Shutzberg , Lucas Soffer , Yirong Tang
IPC: G06F3/01
Abstract: Processing gesture input includes obtaining hand tracking data for a first hand based on one or more camera frames, detecting a first input gesture by the first hand based on the hand tracking data, and determining whether the first hand is in an active state. An input action associated with the first input gesture is initiated in accordance with a determination that the first hand is in the active state. If, while the first hand is in the active state, a determination is made that an inactivity criterion is satisfied, the first hand is transitioned to an inactive state.
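The active/inactive gating described above maps naturally onto a small state machine. The sketch below is hypothetical (HandState, HandTracker, and the two-second inactivity timeout are assumptions, not values from the patent): a detected gesture initiates its input action only while the hand is active, and a simple inactivity criterion transitions the hand back to the inactive state.

```swift
enum HandState { case active, inactive }

struct HandTracker {
    private(set) var state: HandState = .inactive
    private var lastMovementTime: Double = 0

    /// Assumed inactivity criterion: no significant hand movement for 2 seconds.
    private let inactivityTimeout: Double = 2.0

    /// Processes one camera frame's worth of hand tracking results.
    /// Returns a description of the initiated input action, or nil if none is initiated.
    mutating func process(frameTime: Double,
                          handMoved: Bool,
                          detectedGesture: String?) -> String? {
        if handMoved {
            lastMovementTime = frameTime
            state = .active
        } else if state == .active, frameTime - lastMovementTime > inactivityTimeout {
            state = .inactive                       // inactivity criterion satisfied
        }
        // Initiate the input action only while the hand is in the active state.
        guard state == .active, let gesture = detectedGesture else { return nil }
        return "initiate input action for \(gesture)"
    }
}
```

For example, a pinch detected in a frame arriving more than two seconds after the last hand movement would return nil here, because the hand has already transitioned to the inactive state.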
-
Publication No.: US20240385692A1
Publication Date: 2024-11-21
Application No.: US18641711
Application Date: 2024-04-22
Applicant: Apple Inc.
Inventor: Julian K. Shutzberg , Bharat C. Dandu , Daniel J. Brewer
IPC: G06F3/01 , G06F3/04815 , G06F3/04845
Abstract: Devices, systems, and methods that interpret user activity, such as two-hand gestures, as user interactions with virtual elements positioned within a three-dimensional (3D) space. For example, an example process may include receiving data corresponding to user activity involving two hands in a 3D coordinate system. The process may further include identifying actions performed by the two hands based on the data corresponding to the user activity, each of the two hands performing one of the identified actions. The process may further include determining, based on the data corresponding to the user activity, whether the identified actions satisfy a criterion for a gesture type. In accordance with determining that the identified actions satisfy the criterion for the gesture type, the process may further include interpreting the identified actions based on a reference element corresponding to the gesture type, wherein different gesture types correspond to different reference elements.
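As a hedged illustration of the reference-element idea (the zoom/rotate gesture types and the specific reference elements chosen are assumptions, not taken from the patent), the sketch below checks a criterion for a two-hand gesture type and then interprets the hands' actions relative to a reference element that differs by type: a midpoint between the hands for zooming, an axis between the hands for rotating.

```swift
enum HandAction { case pinch, open }

enum TwoHandGesture {
    case zoom(referencePoint: SIMD3<Float>)    // reference element: midpoint of the pinches
    case rotate(referenceAxis: SIMD3<Float>)   // reference element: line between the hands
}

/// Hypothetical interpreter: both hands pinching satisfies the criterion for a
/// two-hand gesture, and the gesture is then interpreted about its reference element.
func interpret(leftAction: HandAction, leftPosition: SIMD3<Float>,
               rightAction: HandAction, rightPosition: SIMD3<Float>,
               handsMovingApart: Bool) -> TwoHandGesture? {
    guard leftAction == .pinch, rightAction == .pinch else { return nil }
    if handsMovingApart {
        // Zoom is interpreted about a reference point between the two hands.
        return .zoom(referencePoint: (leftPosition + rightPosition) / 2)
    } else {
        // Rotation is interpreted about a reference axis defined by the two hands.
        return .rotate(referenceAxis: rightPosition - leftPosition)
    }
}
```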
-