Abstract:
The present disclosure relates to user interfaces for receiving user input. In some examples, a device determines which user input technique a user has accessed most recently, and displays the corresponding user interface. In some examples, a device scrolls through a set of information on the display. When a threshold criterion is satisfied, the device displays an index object fully or partially overlaying the set of information. In some examples, a device displays an emoji graphical object, which is visually manipulated based on user input. The emoji graphical object is transmitted to a recipient. In some examples, a device displays paging affordances that enlarge and allow a user to select a particular page of a user interface. In some examples, the device displays user interfaces for various input methods, including multiple emoji graphical objects. In some examples, a keyboard is displayed for receiving user input.
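A minimal sketch of the scroll-threshold behavior described above, assuming a velocity-based criterion and invented names (ScrollState, updateIndexVisibility, velocityThreshold); the abstract does not specify the threshold criterion or any data structures.

```swift
// Minimal sketch, not the disclosed implementation: show an index overlay
// once a scroll gesture crosses a hypothetical velocity threshold.
struct ScrollState {
    var offset: Double = 0
    var isIndexVisible: Bool = false
}

// `velocityThreshold` is an assumed tunable, not a value from the disclosure.
func updateIndexVisibility(_ state: inout ScrollState,
                           scrollDelta: Double,
                           velocity: Double,
                           velocityThreshold: Double = 800) {
    state.offset += scrollDelta
    // Overlay the index object only while the user is scrolling quickly.
    state.isIndexVisible = abs(velocity) >= velocityThreshold
}
```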
Abstract:
The present disclosure relates to electronic message user interfaces. A device, including a display, a touch-sensitive surface, and a rotatable input mechanism, is described in relation to accessing, composing, and manipulating electronic messages. In some examples, a user can provide input through the rotatable input mechanism to access a landing screen of an electronic messages application. The landing screen concurrently displays an affordance for accessing an electronic messages inbox and an affordance for accessing an interface for composing an electronic message.
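A minimal sketch of the landing-screen structure, with assumed type names (MessagesDestination, LandingScreen); the abstract describes the user interface, not a particular data model.

```swift
// Minimal sketch with assumed names: a landing screen exposing two
// affordances, one for the inbox and one for composing a new message.
enum MessagesDestination {
    case inbox      // electronic messages inbox
    case compose    // interface for composing an electronic message
}

struct LandingScreen {
    // Both affordances are displayed concurrently on the landing screen.
    let affordances: [MessagesDestination] = [.inbox, .compose]

    // Selecting an affordance navigates to the corresponding interface.
    func select(_ affordance: MessagesDestination) -> MessagesDestination {
        affordance
    }
}
```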
Abstract:
The present disclosure relates to aggregating and sharing wellness data. The wellness data can be received by a user device from any number of sensors external or internal to the user device, from a user manually entering the wellness data, or from other users or entities. The user device can securely store the wellness data on the user device and transmit the wellness data to be stored on a remote database. A user of the device can share some or all of the wellness data with friends, relatives, caregivers, healthcare providers, or the like. The user device can further display a user's wellness data in an aggregated view of different types of wellness data. Wellness data of other users can also be viewed if authorizations from those users have been received.
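A minimal sketch of one way to model the aggregation and sharing described above, using assumed names (WellnessSample, WellnessStore) and a simple latest-value-per-type aggregation; it is not the disclosed system and omits storage and transmission.

```swift
import Foundation

// Minimal sketch, not the disclosed system: wellness samples from multiple
// sources aggregated into a per-type summary, with a simple authorization
// check before another user's data can be viewed.
enum WellnessSource { case internalSensor, externalSensor, manualEntry, otherUser }

struct WellnessSample {
    let type: String        // e.g. "heartRate", "steps" (names assumed for illustration)
    let value: Double
    let date: Date
    let source: WellnessSource
}

struct WellnessStore {
    private(set) var samples: [WellnessSample] = []
    var authorizedViewers: Set<String> = []     // user IDs allowed to view shared data

    mutating func add(_ sample: WellnessSample) { samples.append(sample) }

    // Aggregated view: latest value per wellness data type.
    func aggregatedView() -> [String: Double] {
        var latest: [String: (Date, Double)] = [:]
        for s in samples where (latest[s.type]?.0 ?? .distantPast) < s.date {
            latest[s.type] = (s.date, s.value)
        }
        return latest.mapValues { $0.1 }
    }

    func canBeViewed(by userID: String) -> Bool {
        authorizedViewers.contains(userID)
    }
}
```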
Abstract:
In some embodiments, a portable multifunction device with a touch screen display performs a method that includes: displaying a phone call user interface on the touch screen display, wherein the phone call user interface includes: a first informational item associated with an active phone call between a user of the device and a first party, a second informational item associated with a suspended phone call between the user and a second party, and a merge call icon; upon detecting a user selection of the merge call icon, merging the active phone call and the suspended phone call into a conference call between the user, the first party, and the second party, and replacing the phone call user interface with a conference call user interface. The conference call user interface includes: a third informational item associated with the conference call, and a conference call management icon.
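A minimal sketch of the merge transition as a state change, with assumed types (PhoneCall, CallScreen); the abstract describes the user interface, not a particular call model.

```swift
// Minimal sketch (assumed model, not the patented UI): merging an active call
// and a suspended call into a single conference call state.
struct PhoneCall { let party: String }

enum CallScreen {
    case phoneCall(active: PhoneCall, suspended: PhoneCall)   // shows the merge call icon
    case conferenceCall(parties: [String])                    // shows the conference call management icon
}

// Selecting the merge call icon replaces the phone call UI with a conference call UI.
func mergeCalls(_ screen: CallScreen) -> CallScreen {
    guard case let .phoneCall(active, suspended) = screen else { return screen }
    return .conferenceCall(parties: [active.party, suspended.party])
}
```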
Abstract:
A computer-implemented method, for use in conjunction with a portable electronic device with a touch screen display, comprises displaying at least a portion of a structured electronic document on the touch screen display, wherein the structured electronic document comprises a plurality of boxes of content, and detecting a first gesture at a location on the displayed portion of the structured electronic document. A first box in the plurality of boxes at the location of the first gesture is determined. The first box on the touch screen display is enlarged and substantially centered.
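A minimal sketch of the hit-test-then-zoom step, with assumed geometry types and an assumed margin; the abstract does not specify how the enlargement scale or the centering is computed.

```swift
// Minimal sketch under assumed names: hit-test the tapped content box and
// compute a zoom scale and offset that enlarge it toward the display width
// and substantially center it.
struct Rect { var x, y, width, height: Double }

func boxAt(point: (x: Double, y: Double), boxes: [Rect]) -> Rect? {
    boxes.first { point.x >= $0.x && point.x <= $0.x + $0.width &&
                  point.y >= $0.y && point.y <= $0.y + $0.height }
}

func zoomToBox(_ box: Rect, display: Rect, margin: Double = 8) -> (scale: Double, offsetX: Double, offsetY: Double) {
    // Scale so the box (plus a small margin) fills the display width.
    let scale = display.width / (box.width + 2 * margin)
    // Offset so the scaled box is centered in the display.
    let offsetX = display.width / 2 - (box.x + box.width / 2) * scale
    let offsetY = display.height / 2 - (box.y + box.height / 2) * scale
    return (scale, offsetX, offsetY)
}
```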
Abstract:
An electronic device with a display displays a first user interface and detects a first input that includes a first movement. In response to detecting the first input, the device slides the first user interface off the display in a first direction in accordance with the first movement, where a magnitude of the sliding of the first user interface is determined based on a magnitude of the first movement and a first movement proportionality factor, and concurrently slides a second user interface onto the display in the first direction, over the first user interface, in accordance with the first movement while sliding the first user interface off the display. A magnitude of the sliding of the second user interface over the first user interface is determined based on the magnitude of the first movement and a second movement proportionality factor that is different from the first movement proportionality factor.
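A minimal sketch of the two proportionality factors described above; the factor values and names here are assumptions for illustration, not values from the disclosure.

```swift
// Minimal sketch of the described transition: both interfaces track the same
// input movement, but with different proportionality factors.
struct TransitionState {
    var outgoingOffset: Double = 0   // first user interface sliding off the display
    var incomingOffset: Double       // second user interface sliding on, over the first
    init(displayWidth: Double) { incomingOffset = displayWidth }
}

func applyMovement(_ state: inout TransitionState,
                   movement: Double,
                   outgoingFactor: Double = 0.5,   // first movement proportionality factor (assumed)
                   incomingFactor: Double = 1.0) { // second, different proportionality factor (assumed)
    state.outgoingOffset -= movement * outgoingFactor
    state.incomingOffset -= movement * incomingFactor
}
```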
Abstract:
In some embodiments, a computer system changes a visual prominence of a respective virtual object in response to detecting a threshold amount of overlap between a first virtual object and a second virtual object. In some embodiments, a computer system changes a visual prominence of a respective virtual object based on a change in spatial location of a first virtual object with respect to a second virtual object. In some embodiments, a computer system applies visual effects to representations of physical objects, virtual environments, and/or physical environments. In some embodiments, a computer system changes a visual prominence of a virtual object relative to a three-dimensional environment based on display of overlapping objects of different types in the three-dimensional environment. In some embodiments, a computer system changes a level of opacity of a first virtual object overlapping a second virtual object in response to movement of the first virtual object.
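A minimal sketch of one overlap-threshold rule, with visual prominence modeled as opacity and an assumed threshold fraction; the disclosure covers several prominence changes and does not fix these values.

```swift
// Minimal sketch with assumed geometry types: reduce a virtual object's
// prominence (modeled here as opacity) once its overlap with another object
// crosses a threshold fraction of its area.
struct Frame { var x, y, width, height: Double }

func overlapFraction(_ a: Frame, _ b: Frame) -> Double {
    let w = max(0, min(a.x + a.width,  b.x + b.width)  - max(a.x, b.x))
    let h = max(0, min(a.y + a.height, b.y + b.height) - max(a.y, b.y))
    return (w * h) / (a.width * a.height)
}

func prominence(for object: Frame, overlappedBy other: Frame,
                threshold: Double = 0.25,       // assumed threshold amount of overlap
                reducedOpacity: Double = 0.4) -> Double {
    overlapFraction(object, other) >= threshold ? reducedOpacity : 1.0
}
```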
Abstract:
In some embodiments, a computer system facilitates the movement and/or placement of first virtual objects with respect to second virtual objects in a three-dimensional environment. In some embodiments, the computer system utilizes different movement algorithms for moving objects in a three-dimensional environment.
Abstract:
In some embodiments, a computer system changes visual appearance of visual representations of participants moving within a simulated threshold distance of a user of the computer system. In some embodiments, a computer system arranges representations of users according to templates. In some embodiments, a computer system arranges representations of users based on shared content. In some embodiments, a computer system changes a spatial arrangement of participants in accordance with a quantity of participants that are a first type of participant. In some embodiments, a computer system changes a spatial arrangement of elements of a real-time communication session to join a group of participants. In some embodiments, a computer system facilitates interaction with groups of spatial representations of participants of a communication session. In some embodiments, a computer system facilitates updates of a spatial arrangement of participants based on a spatial distribution of the participants.
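A minimal sketch of a template-based arrangement, placing participant representations evenly on a circle around shared content; the template and its parameters are assumptions for illustration, not the disclosed arrangement.

```swift
import Foundation

// Minimal sketch (not the disclosed algorithm): participant representations
// placed evenly on a circle around a shared content location.
func circularTemplate(participantCount: Int,
                      center: (x: Double, y: Double) = (0, 0),
                      radius: Double = 2.0) -> [(x: Double, y: Double)] {
    (0..<participantCount).map { i in
        let angle = 2 * Double.pi * Double(i) / Double(participantCount)
        return (x: center.x + radius * cos(angle), y: center.y + radius * sin(angle))
    }
}
```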