Abstract:
Methods and apparatus organize a plurality of haptic output variations into a cohesive semantic framework that uses various information about the alert condition and trigger, application context, and other conditions to provide a system of haptic outputs that share characteristics between related events. In some embodiments, an event class or application class provides the basis for a corresponding haptic output. In some embodiments, whether an alert-salience setting is on provides the basis for adding an increased-salience haptic output to the standard haptic output for the alert. In some embodiments, consistent haptics provide for branding of the associated application class, application, and/or context.
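As a rough illustration of the framework described above, the Swift sketch below uses hypothetical types (EventClass, HapticPattern) rather than any real platform API: events of the same class share a base pattern, and when the alert-salience setting is on, an extra priming component is prepended to the standard output.

// A minimal sketch, assuming hypothetical EventClass and HapticPattern types.
enum EventClass {
    case personal      // e.g. an incoming message
    case userTriggered // e.g. confirmation of a user action
    case systemNotice  // e.g. a background alert
}

struct HapticPattern {
    let taps: Int
    let intensity: Double // 0.0 ... 1.0
}

/// Events of the same class share a base pattern, so related alerts
/// feel consistent ("branding" of the class).
func basePattern(for eventClass: EventClass) -> HapticPattern {
    switch eventClass {
    case .personal:      return HapticPattern(taps: 2, intensity: 0.8)
    case .userTriggered: return HapticPattern(taps: 1, intensity: 0.5)
    case .systemNotice:  return HapticPattern(taps: 1, intensity: 0.3)
    }
}

/// When the alert-salience setting is on, an increased-salience component
/// is added before the standard pattern for the alert.
func patterns(for eventClass: EventClass, increasedSalience: Bool) -> [HapticPattern] {
    let standard = basePattern(for: eventClass)
    guard increasedSalience else { return [standard] }
    let salienceBoost = HapticPattern(taps: 3, intensity: 1.0)
    return [salienceBoost, standard]
}

// Usage: patterns(for: .personal, increasedSalience: true)
// yields [salienceBoost, standard] rather than [standard] alone.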
Abstract:
An electronic device displays, on a display, a map that is associated with a first orientation. The electronic device receives a touch input on a touch-sensitive surface, and, in response to receiving the touch input on the touch-sensitive surface, rotates the map on the display in accordance with the touch input. While rotating the map on the display in accordance with the touch input, in response to determining that a displayed orientation of the rotated map corresponds to the first orientation of the map, the electronic device generates a first tactile output.
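A minimal sketch of this rotation behavior, assuming a hypothetical MapView class and a placeholder generateTactileOutput() in place of a real haptics API: the tactile output fires only at the moment the rotated orientation re-enters correspondence with the first orientation, not continuously while aligned.

// Hypothetical sketch; not a real MapKit API.
final class MapView {
    /// The orientation the map is first displayed with, in degrees.
    let firstOrientation = 0.0
    private var displayedOrientation = 0.0
    private let tolerance = 2.0 // degrees within which orientations "correspond"

    /// Rotates the map in accordance with a touch input and generates a
    /// tactile output when the displayed orientation comes back into
    /// correspondence with the first orientation.
    func rotate(by delta: Double) {
        let wasAligned = isAligned(displayedOrientation)
        displayedOrientation += delta
        if isAligned(displayedOrientation) && !wasAligned {
            generateTactileOutput()
        }
    }

    /// Angular distance to the first orientation, wrapped to [0, 180].
    private func isAligned(_ angle: Double) -> Bool {
        let d = abs(angle - firstOrientation).truncatingRemainder(dividingBy: 360)
        return min(d, 360 - d) <= tolerance
    }

    private func generateTactileOutput() {
        print("tactile output: map matches its first orientation")
    }
}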
Abstract:
The present disclosure generally relates to user interfaces and techniques for monitoring fitness activity. In accordance with some embodiments, user interfaces and techniques for transitioning from a user interface mode for measuring an activity metric to a user interface mode for measuring a different activity metric, based on detecting that a user characteristic has changed, are described. In accordance with some embodiments, user interfaces and techniques for measuring activity data and providing activity commentary in response to activity-based events, where different activity commentary is provided based on a characteristic of a user, are described.
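The abstract does not name the user characteristic; the sketch below assumes age purely for illustration, with hypothetical types (ActivityMetric, UserCharacteristic) standing in for any real fitness API.

// Hypothetical sketch of characteristic-driven mode switching and commentary.
enum ActivityMetric { case standHours, moveMinutes }

struct UserCharacteristic { var age: Int }

/// When a user characteristic changes (here, crossing an assumed age
/// boundary), the measured activity metric transitions accordingly.
func metric(for user: UserCharacteristic) -> ActivityMetric {
    user.age < 13 ? .moveMinutes : .standHours
}

/// Different activity commentary is provided based on a characteristic
/// of the user (illustrative strings only).
func commentary(forGoalReachedBy user: UserCharacteristic) -> String {
    user.age < 13 ? "Great job moving today!" : "You reached your daily goal."
}

// Usage: metric(for: UserCharacteristic(age: 12)) == .moveMinutes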
Abstract:
An electronic device with a display, a touch-sensitive surface, and one or more sensors that detect intensities of contacts on the touch-sensitive surface displays, on the display, a user interface. While displaying the user interface, the electronic device detects an input that includes a contact on the touch-sensitive surface. In response to detecting the input while displaying the user interface, and while continuing to detect the input on the touch-sensitive surface: if an intensity of the contact satisfies an activation intensity threshold, the electronic device performs a first operation associated with the activation intensity threshold, where the activation intensity threshold is determined based on whether prior inputs by the user on the touch-sensitive surface exceed a respective intensity threshold; and if the intensity of the contact does not satisfy the activation intensity threshold, the electronic device forgoes performing the first operation.
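One plausible reading of the adaptive threshold, sketched with hypothetical names; the abstract does not specify the determination rule, so the majority-vote rule and the numeric values below are illustrative assumptions.

// Hypothetical sketch of an activation threshold adapted to prior inputs.
struct ActivationModel {
    /// Threshold a contact must satisfy to trigger the first operation.
    private(set) var activationThreshold = 0.5
    private var priorIntensities: [Double] = []
    private let referenceThreshold = 0.7 // the "respective intensity threshold"

    /// The activation threshold is determined from whether prior inputs
    /// exceeded the reference threshold: a user who habitually presses
    /// hard gets a higher activation threshold (assumed rule).
    mutating func record(intensity: Double) {
        priorIntensities.append(intensity)
        let hardPresses = priorIntensities.filter { $0 > referenceThreshold }
        if Double(hardPresses.count) > Double(priorIntensities.count) / 2 {
            activationThreshold = 0.6
        } else {
            activationThreshold = 0.4
        }
    }

    /// Perform the first operation only if the contact's intensity
    /// satisfies the current activation threshold; otherwise forgo it.
    func shouldActivate(contactIntensity: Double) -> Bool {
        contactIntensity >= activationThreshold
    }
}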
Abstract:
A computer system displays a first computer-generated experience that includes displaying one or more virtual elements in a three-dimensional environment with a first appearance. While displaying the one or more virtual elements with the first appearance, the computer system receives biometric data corresponding to respiration of a first user. In response to receiving the biometric data: in accordance with a determination that the biometric data corresponding to the respiration of the first user meets first criteria, the computer system displays the one or more virtual elements in the three-dimensional environment with a second appearance that is different from the first appearance; and in accordance with a determination that the biometric data corresponding to the respiration of the first user does not meet the first criteria, the computer system continues to display the one or more virtual elements in the three-dimensional environment with the first appearance.
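A compact sketch of the gating logic, with a hypothetical first-criteria predicate; the breaths-per-minute target is assumed purely for illustration, as the abstract does not define the criteria.

// Hypothetical sketch of the respiration-gated appearance change.
enum Appearance { case first, second }

struct RespirationSample { let breathsPerMinute: Double }

/// Assumed first criteria: respiration slowing below a target rate.
func meetsFirstCriteria(_ sample: RespirationSample) -> Bool {
    sample.breathsPerMinute < 10
}

/// The virtual elements take the second appearance only when the
/// biometric data meets the first criteria; otherwise they keep the
/// first appearance.
func appearance(for sample: RespirationSample) -> Appearance {
    meetsFirstCriteria(sample) ? .second : .first
}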
Abstract:
A computer system displays a first view of a three-dimensional environment, including a first user interface object having a first surface at a first position in the three-dimensional environment. While displaying the first view of the three-dimensional environment, the computer system detects a change in biometric data of a first user, and in response, changes an appearance of the first surface in the first user interface object in accordance with the change in biometric data of the first user. While displaying the first user interface object with the appearance that has been changed based on the change in the biometric data of the first user, the computer system detects first movement of the first user, and in response, changes the appearance of the first user interface object in accordance with the first movement of the first user.
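A sketch of the two-stage appearance change, with hypothetical names and arbitrarily chosen scaling factors: a change in biometric data alters the surface first, and subsequent movement of the user alters it further, on top of the biometric-driven change.

// Hypothetical sketch; property names and factors are illustrative.
struct SurfaceAppearance {
    var rippleAmplitude = 0.0 // driven by biometric change
    var tiltAngle = 0.0       // driven by user movement
}

final class FirstUserInterfaceObject {
    private(set) var surface = SurfaceAppearance()

    /// A detected change in the first user's biometric data changes the
    /// appearance of the first surface.
    func biometricDataDidChange(heartRateDelta: Double) {
        surface.rippleAmplitude += heartRateDelta * 0.1
    }

    /// Movement of the first user, detected while the biometric-driven
    /// appearance is displayed, changes the appearance further.
    func userDidMove(by displacement: Double) {
        surface.tiltAngle += displacement * 5.0
    }
}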