Finger-Mounted Input Devices
    1.
    Invention Grant

    Publication Number: US12229341B2

    Publication Date: 2025-02-18

    Application Number: US16926561

    Application Date: 2020-07-10

    Applicant: Apple Inc.

    Abstract: A system may include an electronic device and one or more finger devices. The electronic device may have a display and the user may provide finger input to the finger device to control the display. The finger input may include pinching, tapping, rotating, swiping, pressing, and/or other finger gestures that are detected using sensors in the finger device. The sensor data related to finger movement may be combined with user gaze information to control items on the display. The user may turn any surface or region of space into an input region by first defining boundaries of the input region using the finger device. Other finger gestures such as pinching and pulling, pinching and rotating, swiping, and tapping may be used to navigate a menu on a display, to scroll through a document, to manipulate computer-aided designs, and to provide other input to a display.
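
    For illustration only, a minimal Swift sketch of the interaction pattern this abstract describes: the user's gaze selects the on-screen target, and a sensor-detected finger gesture selects the action applied to it. All names and types here are hypothetical, not taken from the patent.

        import Foundation

        // Hypothetical gesture vocabulary detected by the finger device's sensors.
        enum FingerGesture {
            case pinch, tap, press
            case rotate(angle: Double)
            case swipe(dx: Double, dy: Double)
        }

        // Hypothetical on-screen item with a rectangular hit area.
        struct DisplayItem {
            let id: String
            let frame: (x: Double, y: Double, width: Double, height: Double)

            func contains(_ point: (x: Double, y: Double)) -> Bool {
                point.x >= frame.x && point.x <= frame.x + frame.width &&
                point.y >= frame.y && point.y <= frame.y + frame.height
            }
        }

        // Combine gaze (target selection) with a finger gesture (action selection).
        func handle(gesture: FingerGesture,
                    gazePoint: (x: Double, y: Double),
                    items: [DisplayItem]) -> String? {
            guard let target = items.first(where: { $0.contains(gazePoint) }) else {
                return nil // user is not gazing at any interactive item
            }
            switch gesture {
            case .pinch:             return "select \(target.id)"
            case .tap:               return "activate \(target.id)"
            case .press:             return "open context menu on \(target.id)"
            case .rotate(let angle): return "rotate \(target.id) by \(angle) rad"
            case .swipe(let dx, _):  return "scroll \(target.id) by \(dx)"
            }
        }

        // Example: a pinch while gazing at the "menu" item selects it.
        let items = [DisplayItem(id: "menu", frame: (x: 0, y: 0, width: 100, height: 40))]
        print(handle(gesture: .pinch, gazePoint: (x: 50, y: 20), items: items) ?? "no target")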

    User Interface Response Based on Gaze-Holding Event Assessment

    Publication Number: US20240103613A1

    Publication Date: 2024-03-28

    Application Number: US18244570

    Application Date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.
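
    For illustration only, a minimal Swift sketch of the association step this abstract describes: a pinch is matched to a gaze-holding event occurring at around the same time, and the pair is interpreted as selection of the gazed-at component. The names and the 0.3-second tolerance are assumptions, not values from the patent.

        import Foundation

        // Hypothetical record of a gaze being held on a UI component.
        struct GazeHoldingEvent {
            let componentID: String
            let start: TimeInterval
            let end: TimeInterval
        }

        // Return the component selected by a pinch at `pinchTime`, if any
        // gaze-holding event overlaps it within a small tolerance window.
        func componentSelected(byPinchAt pinchTime: TimeInterval,
                               gazeEvents: [GazeHoldingEvent],
                               tolerance: TimeInterval = 0.3) -> String? {
            gazeEvents.first { event in
                pinchTime >= event.start - tolerance &&
                pinchTime <= event.end + tolerance
            }?.componentID
        }

        // Example: a pinch at t = 2.1 s pairs with a gaze held on "button"
        // from t = 1.8 s to t = 2.0 s, so "button" is selected.
        let events = [GazeHoldingEvent(componentID: "button", start: 1.8, end: 2.0)]
        print(componentSelected(byPinchAt: 2.1, gazeEvents: events) ?? "no selection")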

    User Interface Response Based on Gaze-Holding Event Assessment

    Publication Number: US20240393876A1

    Publication Date: 2024-11-28

    Application Number: US18790194

    Application Date: 2024-07-31

    Applicant: Apple Inc.

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    User Interface Response Based on Gaze-Holding Event Assessment

    Publication Number: US12099653B2

    Publication Date: 2024-09-24

    Application Number: US18244570

    Application Date: 2023-09-11

    Applicant: Apple Inc.

    CPC classification number: G06F3/013 G06F3/017 G06F3/04842

    Abstract: Various implementations provide views of 3D environments (e.g., extended reality (XR) environments). Non-eye-based user activity, such as hand gestures, is associated with some types of eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a 3D environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time as the pinching hand gesture is made. These associated behaviors (e.g., the pinch and gaze at the button) may then be interpreted as user input, e.g., user input selecting or otherwise acting upon that user interface component. In some implementations, non-eye-based user activity is only associated with types of eye-based user activity that are likely to correspond to a user perceiving what they are seeing and/or intentionally looking at something.

    Finger-Mounted Input Devices
    6.
    Invention Application

    Publication Number: US20210089131A1

    Publication Date: 2021-03-25

    Application Number: US16926561

    Application Date: 2020-07-10

    Applicant: Apple Inc.

    Abstract: A system may include an electronic device and one or more finger devices. The electronic device may have a display and the user may provide finger input to the finger device to control the display. The finger input may include pinching, tapping, rotating, swiping, pressing, and/or other finger gestures that are detected using sensors in the finger device. The sensor data related to finger movement may be combined with user gaze information to control items on the display. The user may turn any surface or region of space into an input region by first defining boundaries of the input region using the finger device. Other finger gestures such as pinching and pulling, pinching and rotating, swiping, and tapping may be used to navigate a menu on a display, to scroll through a document, to manipulate computer-aided designs, and to provide other input to a display.

    Gaze Behavior Detection
    7.
    Invention Publication

    Publication Number: US20230418372A1

    Publication Date: 2023-12-28

    Application Number: US18211712

    Application Date: 2023-06-20

    Applicant: Apple Inc.

    CPC classification number: G06F3/013

    Abstract: Various implementations disclosed herein include devices, systems, and methods that determine a gaze behavior state to identify gaze shifting events, gaze holding events, and loss events of a user based on physiological data. For example, an example process may include obtaining eye data associated with a gaze during a first period of time (e.g., eye position and velocity, interpupillary distance, pupil diameters, etc.). The process may further include obtaining head data associated with the gaze during the first period of time (e.g., head position and velocity). The process may further include determining a first gaze behavior state during the first period of time to identify gaze shifting events, gaze holding events, and loss events (e.g., one or more gaze and head pose characteristics may be determined, aggregated, and used to classify the user's eye movement state using machine learning techniques).
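
    For illustration only, a toy Swift sketch of the classification idea in this abstract. The abstract describes aggregating gaze and head-pose characteristics and classifying them with machine-learning techniques; this sketch substitutes simple velocity thresholds purely to make the three states concrete. All thresholds and names are assumptions.

        import Foundation

        enum GazeBehaviorState { case shifting, holding, loss }

        // Hypothetical per-sample eye and head data.
        struct GazeSample {
            let eyeVelocity: Double   // deg/s, eye-in-head angular velocity
            let headVelocity: Double  // deg/s, head angular velocity
            let eyesTracked: Bool     // false when the tracker loses the eyes
        }

        func classify(_ sample: GazeSample,
                      shiftThreshold: Double = 30.0) -> GazeBehaviorState {
            guard sample.eyesTracked else { return .loss }
            // Gaze-in-world velocity combines eye and head motion; a fast
            // combined velocity indicates a saccade-like gaze shift.
            let gazeVelocity = abs(sample.eyeVelocity + sample.headVelocity)
            return gazeVelocity > shiftThreshold ? .shifting : .holding
        }

        // Example: slow, tracked motion is a gaze-holding event.
        print(classify(GazeSample(eyeVelocity: 3.0, headVelocity: -1.0, eyesTracked: true)))  // holding
        print(classify(GazeSample(eyeVelocity: 250.0, headVelocity: 5.0, eyesTracked: true))) // shifting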

    Dynamic Image Stabilization Using Motion Sensors

    Publication Number: US20190317659A1

    Publication Date: 2019-10-17

    Application Number: US16366503

    Application Date: 2019-03-27

    Applicant: Apple Inc.

    Abstract: An electronic device may include a display for displaying image content to a user and dynamic image stabilization circuitry that dynamically adjusts the image content when the device moves unpredictably, helping keep the content aligned with the user's gaze. The electronic device may include sensors for detecting the displacement of the device. The dynamic image stabilization circuitry may include a usage scenario detection circuit and a content displacement compensation calculation circuit. The usage scenario detection circuit receives data from the sensors and infers a usage scenario based on the sensor data. The content displacement compensation calculation circuit uses the inferred usage scenario to compute a displacement amount by which to adjust the image content. When motion stops, the image content may gradually drift back to the center of the display.
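
    For illustration only, a minimal Swift sketch of the compensation behavior this abstract describes: while the device moves, the content is shifted against the measured displacement, and when motion stops, the offset decays so the content drifts back toward the display center. The gains, the motion threshold, and the names are assumptions, and the usage-scenario inference step is omitted.

        import Foundation

        struct Stabilizer {
            var contentOffset: Double = 0.0  // current content shift, in pixels
            let compensationGain: Double     // fraction of displacement to counter
            let driftRate: Double            // per-update pull back toward center

            mutating func update(deviceDisplacement: Double) {
                if abs(deviceDisplacement) > 0.001 {
                    // Device is moving: shift content against the displacement
                    // so it stays aligned with the user's gaze.
                    contentOffset -= compensationGain * deviceDisplacement
                } else {
                    // Motion stopped: decay the offset so the content
                    // gradually drifts back to the center of the display.
                    contentOffset *= (1.0 - driftRate)
                }
            }
        }

        var stabilizer = Stabilizer(compensationGain: 0.9, driftRate: 0.1)
        stabilizer.update(deviceDisplacement: 10.0)  // device jolts by 10 px
        print(stabilizer.contentOffset)              // content shifted to -9.0 px
        stabilizer.update(deviceDisplacement: 0.0)   // motion stops
        print(stabilizer.contentOffset)              // offset decays toward 0 (-8.1 px)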
