-
Publication No.: US12229341B2
Publication Date: 2025-02-18
Application No.: US16926561
Filing Date: 2020-07-10
Applicant: Apple Inc.
Inventor: Paul X. Wang, Tim H. Cornelissen, Richard G. Huizar, Paul V. Johnson
Abstract: A system may include an electronic device and one or more finger devices. The electronic device may have a display and the user may provide finger input to the finger device to control the display. The finger input may include pinching, tapping, rotating, swiping, pressing, and/or other finger gestures that are detected using sensors in the finger device. The sensor data related to finger movement may be combined with user gaze information to control items on the display. The user may turn any surface or region of space into an input region by first defining boundaries of the input region using the finger device. Other finger gestures such as pinching and pulling, pinching and rotating, swiping, and tapping may be used to navigate a menu on a display, to scroll through a document, to manipulate computer-aided designs, and to provide other input to a display.
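The abstract above describes fusing finger-gesture input from a worn finger device with gaze information to act on display items. A minimal illustrative sketch of that fusion follows; all names, the gesture set, and the selection rule are assumptions for illustration, not details from the patent:

```python
# Hypothetical sketch: a recognized finger gesture is applied to whatever
# display item the user is currently gazing at. Names are illustrative.
from typing import Optional

GESTURES = {"pinch", "tap", "swipe", "rotate", "press"}  # gestures named in the abstract

def select_item(gesture: str, gaze_target: Optional[str]) -> Optional[str]:
    """Return the display item acted on when a recognized finger gesture
    coincides with a valid gaze target; otherwise return None."""
    if gesture in GESTURES and gaze_target is not None:
        return gaze_target  # the gesture applies to the gazed-at item
    return None

print(select_item("pinch", "menu_button"))  # menu_button
print(select_item("wave", "menu_button"))   # None (unrecognized gesture)
```

In practice the gaze target would come from an eye tracker and the gesture from the finger device's motion/force sensors; this sketch only shows the combination step.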
-
Publication No.: US20240402800A1
Publication Date: 2024-12-05
Application No.: US18676786
Filing Date: 2024-05-29
Applicant: Apple Inc.
Inventor: Julian K. Shutzberg, David J. Meyer, David M. Teitelbaum, Mehmet N. Agaoglu, Ian R. Fasel, Chase B. Lortie, Daniel J. Brewer, Tim H. Cornelissen, Leah M. Gum, Alexander G. Berardino, Lorenzo Soto Doblado, Vinay Chawda, Itay Bar Yosef, Dror Irony, Eslam A. Mostafa, Guy Engelhard, Paul A. Lacey, Ashwin Kumar Asoka Kumar Shenoi, Bhavin Vinodkumar Nayak, Liuhao Ge, Lucas Soffer, Victor Belyaev, Bharat C. Dandu, Matthias M. Schroeder, Yirong Tang
IPC: G06F3/01, G06F3/04815
Abstract: Various implementations disclosed herein include devices, systems, and methods that interpret user activity as user interactions with user interface (UI) elements positioned within a three-dimensional (3D) space such as an extended reality (XR) environment. Some implementations enable user interactions with virtual elements displayed in 3D environments that utilize alternative input modalities, e.g., XR environments that interpret user activity as either direct interactions or indirect interactions with virtual elements.
-
Publication No.: US20240103613A1
Publication Date: 2024-03-28
Application No.: US18244570
Filing Date: 2023-09-11
Applicant: Apple Inc.
Inventor: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Berardino
IPC: G06F3/01, G06F3/04842
CPC classification number: G06F3/013, G06F3/017, G06F3/04842
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
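The abstract above describes associating a pinch gesture with the user interface component the user gazed at "at around the same time." A sketch of that time-window association follows; the window size, data shapes, and function names are assumptions, not from the patent:

```python
# Illustrative sketch: pair a pinch gesture with the gaze fixation closest
# in time, treating the pair as a selection of the gazed-at UI component.
from typing import List, Optional, Tuple

ASSOCIATION_WINDOW_S = 0.25  # assumed tolerance between pinch and gaze (seconds)

def associate(pinch_time: float,
              gaze_events: List[Tuple[float, str]]) -> Optional[str]:
    """gaze_events: (timestamp, ui_component) fixations.
    Return the component gazed at nearest the pinch, within the window."""
    candidates = [(abs(t - pinch_time), target)
                  for t, target in gaze_events
                  if abs(t - pinch_time) <= ASSOCIATION_WINDOW_S]
    return min(candidates)[1] if candidates else None

gazes = [(0.90, "slider"), (1.02, "button"), (1.60, "window")]
print(associate(1.00, gazes))  # button (closest fixation within the window)
```

The abstract's final sentence suggests a further filter: only fixations likely to reflect intentional looking would be eligible as candidates, which could be added as a predicate on each gaze event.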
-
Publication No.: US20240393876A1
Publication Date: 2024-11-28
Application No.: US18790194
Filing Date: 2024-07-31
Applicant: Apple Inc.
Inventor: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Berardino
IPC: G06F3/01, G06F3/04842
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
-
Publication No.: US12099653B2
Publication Date: 2024-09-24
Application No.: US18244570
Filing Date: 2023-09-11
Applicant: Apple Inc.
Inventor: Vinay Chawda, Mehmet N. Agaoglu, Leah M. Gum, Paul A. Lacey, Julian K. Shutzberg, Tim H. Cornelissen, Alexander G. Berardino
IPC: G06F3/01, G06F3/04842
CPC classification number: G06F3/013, G06F3/017, G06F3/04842
Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
-
Publication No.: US20210089131A1
Publication Date: 2021-03-25
Application No.: US16926561
Filing Date: 2020-07-10
Applicant: Apple Inc.
Inventor: Paul X. Wang, Tim H. Cornelissen, Richard G. Huizar, Paul V. Johnson
Abstract: A system may include an electronic device and one or more finger devices. The electronic device may have a display and the user may provide finger input to the finger device to control the display. The finger input may include pinching, tapping, rotating, swiping, pressing, and/or other finger gestures that are detected using sensors in the finger device. The sensor data related to finger movement may be combined with user gaze information to control items on the display. The user may turn any surface or region of space into an input region by first defining boundaries of the input region using the finger device. Other finger gestures such as pinching and pulling, pinching and rotating, swiping, and tapping may be used to navigate a menu on a display, to scroll through a document, to manipulate computer-aided designs, and to provide other input to a display.
-
Publication No.: US20230418372A1
Publication Date: 2023-12-28
Application No.: US18211712
Filing Date: 2023-06-20
Applicant: Apple Inc.
Inventor: Mehmet N. Agaoglu, Andrew B. Watson, Tim H. Cornelissen, Alexander G. Berardino
IPC: G06F3/01
CPC classification number: G06F3/013
Abstract: Various implementations disclosed herein include devices, systems, and methods that determine a gaze behavior state to identify gaze shifting events, gaze holding events, and loss events of a user based on physiological data. For example, an example process may include obtaining eye data associated with a gaze during a first period of time (e.g., eye position and velocity, interpupillary distance, pupil diameters, etc.). The process may further include obtaining head data associated with the gaze during the first period of time (e.g., head position and velocity). The process may further include determining a first gaze behavior state during the first period of time to identify gaze shifting events, gaze holding events, and loss events (e.g., one or more gaze and head pose characteristics may be determined, aggregated, and used to classify the user's eye movement state using machine learning techniques).
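The abstract above describes classifying eye and head data into gaze shifting, gaze holding, and loss events. A toy velocity-threshold classifier illustrates the idea; the thresholds and the confidence signal are assumptions, and the patent describes aggregated gaze/head-pose features and possibly machine learning rather than fixed cutoffs:

```python
# Hypothetical sketch: classify one sample of eye data into the three
# gaze behavior states named in the abstract. Thresholds are assumed.

SACCADE_VELOCITY_DEG_S = 100.0  # assumed eye-velocity cutoff for a gaze shift
LOSS_CONFIDENCE = 0.5           # assumed tracking-confidence floor

def gaze_state(eye_velocity_deg_s: float, tracking_confidence: float) -> str:
    if tracking_confidence < LOSS_CONFIDENCE:
        return "loss"      # eye data unreliable (e.g., blink or track loss)
    if eye_velocity_deg_s > SACCADE_VELOCITY_DEG_S:
        return "shifting"  # rapid movement: gaze shifting event (saccade)
    return "holding"       # slow/stable gaze: gaze holding event (fixation)

print(gaze_state(250.0, 0.9))  # shifting
print(gaze_state(10.0, 0.9))   # holding
print(gaze_state(10.0, 0.2))   # loss
```

A production classifier would also consume head position and velocity so that smooth pursuit and vestibulo-ocular movement (eye rotation compensating head motion) are not misclassified as shifts.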
-
Publication No.: US11119624B2
Publication Date: 2021-09-14
Application No.: US16366503
Filing Date: 2019-03-27
Applicant: Apple Inc.
Inventor: Paul V. Johnson, Ahmad Rahmati, Chaohao Wang, Cheng Chen, Graham B. Myhre, Jiaying Wu, Paolo Sacchetto, Sheng Zhang, Yunhui Hou, Xiaokai Li, Tim H. Cornelissen
IPC: G06F3/0485, G06F3/0481, G06F3/0488, G06F3/0484, G06F3/01
Abstract: An electronic device may include a display for displaying image content to a user and dynamic image stabilization circuitry for dynamically compensating the image content if the device is moving unpredictably to help keep the image content aligned with the user's gaze. The electronic device may include sensors for detecting the displacement of the device. The dynamic image stabilization circuitry may include a usage scenario detection circuit and a content displacement compensation calculation circuit. The usage scenario detection circuit receives data from the sensors and infers a usage scenario based on the sensor data. The content displacement compensation calculation circuit uses the inferred usage scenario to compute a displacement amount by which to adjust image content. When motion stops, the image content may gradually drift back to the center of the display.
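The abstract above describes a compensation loop: sensed device motion yields a displacement amount that offsets the image content, and once motion stops the content drifts back to the display center. A one-axis sketch of that loop follows; the gain and drift rate are assumptions, not values from the patent:

```python
# Minimal sketch of per-frame dynamic image stabilization on one axis.
# Gains are assumed; a real implementation would derive the compensation
# from the inferred usage scenario (e.g., walking vs. riding).

COMPENSATION_GAIN = 0.8  # assumed fraction of device motion to counteract
DRIFT_RATE = 0.1         # assumed per-frame recentering factor

def update_offset(offset_px: float, device_motion_px: float) -> float:
    """Return the new content offset (pixels) after one frame."""
    if device_motion_px != 0.0:
        # shift content opposite to the detected displacement
        return offset_px - COMPENSATION_GAIN * device_motion_px
    # no motion: content gradually drifts back toward the display center
    return offset_px * (1.0 - DRIFT_RATE)

offset = 0.0
for motion in [5.0, 5.0, 0.0, 0.0]:  # two frames of motion, then still
    offset = update_offset(offset, motion)
print(round(offset, 2))  # -6.48 (decaying back toward 0)
```

The patent's usage-scenario detection would sit upstream of this loop, choosing whether and how strongly to compensate based on what the sensors imply the user is doing.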
-
Publication No.: US20190317659A1
Publication Date: 2019-10-17
Application No.: US16366503
Filing Date: 2019-03-27
Applicant: Apple Inc.
Inventor: Paul V. Johnson, Ahmad Rahmati, Chaohao Wang, Cheng Chen, Graham B. Myhre, Jiaying Wu, Paolo Sacchetto, Sheng Zhang, Yunhui Hou, Xiaokai Li, Tim H. Cornelissen
IPC: G06F3/0484, G06F3/01, G06F3/0481
Abstract: An electronic device may include a display for displaying image content to a user and dynamic image stabilization circuitry for dynamically compensating the image content if the device is moving unpredictably to help keep the image content aligned with the user's gaze. The electronic device may include sensors for detecting the displacement of the device. The dynamic image stabilization circuitry may include a usage scenario detection circuit and a content displacement compensation calculation circuit. The usage scenario detection circuit receives data from the sensors and infers a usage scenario based on the sensor data. The content displacement compensation calculation circuit uses the inferred usage scenario to compute a displacement amount by which to adjust image content. When motion stops, the image content may gradually drift back to the center of the display.