Abstract:
A method for adjusting audio being output through a beam-forming loudspeaker array. Program audio is rendered to drive the loudspeaker array to produce sound beams having i) a main content pattern that is aimed at a listener, superimposed with ii) several diffuse content patterns that are aimed away from the listener. In response to receiving an alert message that refers to alert audio, the portion of the program audio in the main pattern is moved into the diffuse patterns, and the alert audio is rendered to drive the loudspeaker array so that the portion of the program audio in the main pattern is replaced with the alert audio. Other embodiments are also described and claimed.
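A minimal sketch of the ducking behavior this abstract describes, modeled as a state machine over beam contents. The class name `BeamRenderer` and the content labels are illustrative assumptions, not terms from the claims.

```python
class BeamRenderer:
    """Models a loudspeaker array driven with one main content pattern
    (aimed at the listener) and several diffuse content patterns
    (aimed away from the listener)."""

    def __init__(self, num_diffuse=2):
        self.main = "program"                      # program audio in the main pattern
        self.diffuse = ["program"] * num_diffuse   # program audio in the diffuse patterns

    def handle_alert(self, alert_audio):
        # On an alert message: the program audio in the main pattern is
        # moved into the diffuse patterns, and the alert audio replaces
        # it in the main pattern.
        self.diffuse = ["program"] * len(self.diffuse)
        self.main = alert_audio

    def alert_done(self):
        # Restore program audio to the main pattern once the alert ends.
        self.main = "program"
```

Modeling the patterns as labeled slots keeps the "replace main, preserve program in diffuse" ordering explicit.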
Abstract:
A breathing sequence may define a suggested breathing pattern. Input may be received at a user interface of a device to initiate the breathing sequence. The breathing sequence may include a configuration phase in which configuration information may be received. The configuration information may define a variable time period for the breathing sequence. The breathing sequence also may include a preliminary phase during which a first version of a fluctuating progress indicator may be presented on the user interface. The fluctuating progress indicator may include a plurality of variable visual characteristics and may fluctuate at a first cyclic rate. The breathing sequence may also include a breathing phase during which a second version of the fluctuating progress indicator may be presented. The second version of the fluctuating progress indicator may fluctuate at a second cyclic rate according to a breathing rate.
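A sketch of the three-phase breathing sequence as an ordered list of phases with the cyclic rate of the fluctuating progress indicator in each. The doubled preliminary rate is an assumption for illustration; the abstract only requires that the two rates differ.

```python
def breathing_sequence(variable_minutes, breathing_rate_cpm):
    """Return (phase, cyclic_rate_in_cycles_per_minute) tuples for a
    breathing sequence of the given variable time period."""
    preliminary_rate = 2 * breathing_rate_cpm   # assumed faster first version
    return [
        ("configuration", None),            # configuration info sets the time period
        ("preliminary", preliminary_rate),  # first version of the indicator fluctuates
        ("breathing", breathing_rate_cpm),  # second version follows the breathing rate
    ]
```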
Abstract:
An electronic device with a touch-sensitive surface, a display, and tactile output generator(s) displays a user interface including a user interface element. The device detects a contact on the user interface element and an input by the contact including a movement of the contact across the touch-sensitive surface. In response to detecting the input by the contact, the device changes a position of an outer edge of the user interface element relative to a threshold position in the user interface in accordance with the movement of the contact. The device further detects that the change in the position of the outer edge of the user interface element has caused the outer edge of the user interface element to move across the threshold position. After detecting the threshold position crossing, the device generates a tactile output and moves the position of the outer edge to the threshold position.
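The edge-dragging behavior can be sketched as a pure function over positions: if the movement carries the outer edge across the threshold position, a tactile output is generated and the edge is moved to the threshold. The one-dimensional coordinates are an illustrative simplification.

```python
def drag_outer_edge(edge_pos, delta, threshold_pos):
    """Move the element's outer edge by delta. If the edge crosses the
    threshold position, generate a tactile output and move the edge to
    the threshold. Returns (new_edge_pos, tactile_output_generated)."""
    new_pos = edge_pos + delta
    crossed = (edge_pos < threshold_pos <= new_pos) or \
              (edge_pos > threshold_pos >= new_pos)
    if crossed:
        return threshold_pos, True
    return new_pos, False
```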
Abstract:
An electronic device with one or more processors and memory is in communication with a display. The device, while in a first playback navigation mode, provides, to the display, video information for display; and receives an input that corresponds to a request by a user to switch to a second playback navigation mode. The video information includes information that corresponds to one or more frames of a video, a scrubber bar that represents a timeline of the video, a first playhead that indicates a current play position in the scrubber bar, and playback position markers, distinct from the first playhead, that indicate predetermined playback positions in the video. The device, in response to receiving the input, transitions from the first playback navigation mode to the second playback navigation mode; and, while in the second playback navigation mode, ceases to provide information that corresponds to the playback position markers.
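A sketch of the video information provided in each playback navigation mode: both modes include the frames, scrubber bar, and first playhead, but only the first mode includes the playback position markers. The dictionary keys and mode names are illustrative.

```python
def video_info(mode, marker_positions):
    """Return the video information provided to the display while in
    the given playback navigation mode."""
    info = {"frames": True, "scrubber_bar": True, "first_playhead": True}
    if mode == "first":
        # Only the first mode provides the predetermined playback
        # position markers, distinct from the first playhead; the
        # device ceases to provide them after switching to the second.
        info["playback_position_markers"] = marker_positions
    return info
```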
Abstract:
An electronic device provides, to a display, data to present a user interface with a plurality of user interface objects that includes a first user interface object and a second user interface object. A current focus is on the first user interface object. The device receives an input that corresponds to a request to move the current focus; and, in response, provides, to the display, data to: move the first user interface object from a first position towards the second user interface object and/or tilt the first user interface object from a first orientation towards the second user interface object; and, after moving and/or tilting the first user interface object, move the current focus from the first user interface object to the second user interface object, and move the first user interface object back towards the first position and/or tilt the first user interface object back towards the first orientation.
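The focus-move choreography above can be sketched as an ordered list of display updates: lean the first object toward the second, transfer the current focus, then return the first object to its original position and orientation. The update labels are illustrative.

```python
def move_current_focus(state):
    """Apply the focus move and return the ordered display updates."""
    updates = [
        "move_or_tilt_first_toward_second",            # lean toward the target
        "move_focus_to_second",                        # focus transfers after the lean
        "return_first_to_position_and_orientation",    # first object settles back
    ]
    state["current_focus"] = "second"
    return updates
```

Keeping the updates ordered captures the key detail that the focus moves only after the first object has begun moving or tilting toward the second.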
Abstract:
An electronic device displays a user interface that includes one or more user interface elements; detects a user input on a touch-sensitive surface that includes detecting a contact at a location that corresponds to a respective user interface element; and, in response: if the user input satisfies menu-display criteria, including a criterion that is satisfied when the contact has an intensity above a first intensity threshold, displays a first menu overlaid on the user interface; and, if the user input satisfies action criteria, wherein the action criteria are capable of being satisfied when the intensity of the contact is below the first intensity threshold, initiates performance of an action associated with the respective user interface element without displaying the first menu.
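The intensity-based branching can be sketched as a single comparison: a contact above the first intensity threshold satisfies the menu-display criteria, while a contact below it can satisfy the action criteria instead. The threshold value and return labels are illustrative assumptions.

```python
def respond_to_input(contact_intensity, first_intensity_threshold=0.6):
    """Decide the response to an input by a contact on a user
    interface element, per the criteria above."""
    if contact_intensity > first_intensity_threshold:
        return "display_first_menu"        # menu-display criteria satisfied
    return "perform_associated_action"     # action criteria may be satisfied
```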
Abstract:
Systems, methods, and computer-readable media are provided for managing alerts of one or more computing devices. For example, a user device may configure a user interface to present electronic content corresponding to a first category. The user device may also receive a data structure of rules. At least one rule may correspond to an entry of the data structure for an alert category pairing. The user device may receive information that identifies an incoming alert and determine a presentation method for the incoming alert based at least in part on a corresponding rule. The user device may also present the incoming alert based at least in part on the determined presentation method.
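A sketch of the rule data structure as a table keyed by category pairings, mapping each pairing to a presentation method. The specific categories, methods, and default are illustrative assumptions, not from the abstract.

```python
# Each entry pairs the category of the currently presented content with
# the incoming alert's category (an "alert category pairing").
RULES = {
    ("driving", "message"): "suppress",
    ("driving", "emergency"): "full_screen",
    ("browsing", "message"): "banner",
}

def presentation_method(content_category, alert_category, default="banner"):
    """Determine how to present an incoming alert from the rule that
    corresponds to the category pairing, falling back to a default."""
    return RULES.get((content_category, alert_category), default)
```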
Abstract:
Methods and apparatus organize a plurality of haptic output variations into a cohesive semantic framework that uses various information about the alert condition and trigger, application context, and other conditions to provide a system of haptic outputs that share characteristics between related events. In some embodiments, an event class or application class provides the basis for a corresponding haptic output. In some embodiments, whether an alert-salience setting is on provides the basis for adding an increased salience haptic output to the standard haptic output for the alert. In some embodiments, consistent haptics provide for branding of the associated application class, application, and/or context.
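A sketch of the semantic framework as a lookup from event class to a shared haptic output, with an increased-salience component prepended when the alert-salience setting is on. The class names and waveform labels are illustrative assumptions.

```python
# Events in the same class share a haptic output, which supports
# consistent "branding" of the associated application class.
EVENT_CLASS_HAPTICS = {
    "message": "tap-tap",
    "calendar": "long-buzz",
    "payment": "tick",
}

def haptic_for(event_class, salience_setting_on=False):
    """Return the haptic output for an alert in the given event class."""
    base = EVENT_CLASS_HAPTICS.get(event_class, "default-tap")
    if salience_setting_on:
        # Add the increased-salience output ahead of the standard one.
        return "salience-buzz+" + base
    return base
```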