Abstract:
A computer-implemented method for use in conjunction with a computing device with a touch screen display comprises: detecting one or more finger contacts with the touch screen display, applying one or more heuristics to the one or more finger contacts to determine a command for the device, and processing the command. The one or more heuristics comprise: a heuristic for determining that the one or more finger contacts correspond to a one-dimensional vertical screen scrolling command, a heuristic for determining that the one or more finger contacts correspond to a two-dimensional screen translation command, and a heuristic for determining that the one or more finger contacts correspond to a command to transition from displaying a respective item in a set of items to displaying a next item in the set of items.
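The abstract does not spell out the heuristics themselves, so the sketch below is only one plausible reading: classify a finger movement as a vertical scroll, a two-dimensional pan, or a next-item swipe based on its angle of travel. The thresholds, the `GestureCommand` cases, and the `classify` function are illustrative assumptions, not names or values taken from the claimed method.

```swift
import Foundation

/// The three commands named in the abstract.
enum GestureCommand {
    case verticalScroll      // one-dimensional vertical screen scrolling
    case twoDimensionalPan   // two-dimensional screen translation
    case nextItem            // transition to the next item in a set of items
}

struct FingerTrack {
    var startX, startY, endX, endY: Double
}

/// Illustrative heuristics: the angle and distance thresholds are hypothetical,
/// not values from the patent.
func classify(_ track: FingerTrack) -> GestureCommand? {
    let dx = track.endX - track.startX
    let dy = track.endY - track.startY
    let distance = (dx * dx + dy * dy).squareRoot()
    guard distance > 10 else { return nil }              // ignore tiny movements

    // Angle of travel measured away from the vertical axis, in degrees.
    let angleFromVertical = atan2(abs(dx), abs(dy)) * 180 / .pi

    if angleFromVertical <= 27 {
        return .verticalScroll        // mostly vertical: 1-D scroll
    } else if angleFromVertical >= 63 && abs(dx) > 60 {
        return .nextItem              // long, mostly horizontal swipe: next item
    } else {
        return .twoDimensionalPan     // in between: free 2-D translation
    }
}
```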
Abstract:
Some embodiments of the invention provide a novel prediction engine that (1) can formulate predictions about current or future destinations and/or routes to such destinations for a user, and (2) can relay information to the user about these predictions. In some embodiments, this engine includes a machine-learning engine that facilitates the formulation of predicted future destinations and/or future routes to destinations based on stored, user-specific data. The user-specific data is different in different embodiments. In some embodiments, the stored, user-specific data includes data about any combination of the following: (1) previous destinations traveled to by the user, (2) previous routes taken by the user, (3) locations of calendared events in the user's calendar, (4) locations of events for which the user has electronic tickets, and (5) addresses parsed from recent e-mails and/or messages sent to the user. In some embodiments, the prediction engine only relies on user-specific data stored on the device on which this engine executes. Alternatively, in other embodiments, it relies only on user-specific data stored outside of the device by external devices/servers. In still other embodiments, the prediction engine relies on user-specific data stored both by the device and by other devices/servers.
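As a rough illustration of how the listed data sources might feed a prediction, the sketch below scores candidate destinations drawn from those sources and returns the top few. It is a naive stand-in for the machine-learning engine described above; the `Candidate` type, the weights, and the time window are invented for the example.

```swift
import Foundation

/// Candidate sources named in the abstract; the weights below are purely
/// illustrative, not values from the described machine-learning engine.
enum DestinationSource {
    case previousDestination, previousRoute, calendarEvent, ticketedEvent, parsedAddress
}

struct Candidate {
    let name: String
    let source: DestinationSource
    let timesVisited: Int
    let minutesUntilRelevant: Int?   // e.g. minutes until a calendared event starts
}

/// A naive stand-in for the prediction engine: score each candidate and
/// return the most likely next destinations.
func predictDestinations(from candidates: [Candidate], limit: Int = 3) -> [Candidate] {
    func score(_ c: Candidate) -> Double {
        var s = Double(c.timesVisited)                      // habit strength
        if let minutes = c.minutesUntilRelevant, minutes < 120 {
            s += 10.0 * (1.0 - Double(minutes) / 120.0)     // imminent events rank higher
        }
        switch c.source {
        case .calendarEvent, .ticketedEvent: s += 2         // explicit commitments
        case .previousDestination, .previousRoute: s += 1
        case .parsedAddress: s += 0.5
        }
        return s
    }
    return candidates.sorted { score($0) > score($1) }.prefix(limit).map { $0 }
}
```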
Abstract:
A device with a touch-sensitive display may be unlocked via gestures performed on the touch-sensitive display. The device is unlocked if contact with the display corresponds to a predefined gesture for unlocking the device. The device displays one or more unlock images with respect to which the predefined gesture is to be performed in order to unlock the device. The performance of the predefined gesture with respect to the unlock image may include moving the unlock image to a predefined location and/or moving the unlock image along a predefined path. The device may also display visual cues of the predefined gesture on the touch screen to remind a user of the gesture.
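One way to picture the check is a drag that must stay on the predefined path and end at the predefined location. The sketch below assumes a straight horizontal path and invented tolerances; none of the names or numbers come from the patent.

```swift
import Foundation

struct Point { var x, y: Double }

/// Illustrative "drag the unlock image to a target" check. The straight
/// horizontal path and the tolerances are assumptions made for this sketch.
struct UnlockGesture {
    var imageStart: Point        // where the unlock image initially sits
    var target: Point            // predefined location that completes the unlock
    var pathTolerance = 20.0     // how far the contact may stray from the path
    var targetRadius = 15.0      // how close the drop must be to the target

    /// Returns true when the contact path stays on the predefined path and
    /// ends at the predefined location, i.e. the device should unlock.
    func unlocks(contactPath: [Point]) -> Bool {
        guard let last = contactPath.last else { return false }
        let staysOnPath = contactPath.allSatisfy { abs($0.y - imageStart.y) <= pathTolerance }
        let dx = last.x - target.x, dy = last.y - target.y
        let reachesTarget = (dx * dx + dy * dy).squareRoot() <= targetRadius
        return staysOnPath && reachesTarget
    }
}
```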
Abstract:
A multifunction device displays a navigation user interface that includes a navigation bar having a plurality of unit regions and a plurality of subunit regions. Each of the unit regions represents a range of values. Each subunit region represents a subset of a respective range of values. The navigation user interface also includes a content area for displaying content associated with subunit regions. In response to detecting an input that selects a respective subunit region, the multifunction device updates the content area in accordance with the respective selected subunit region. In response to detecting an input that selects a respective unit region, the multifunction device updates the navigation bar to include subunit regions in accordance with the selected unit region and updates the content area in accordance with at least one of the subunit regions in the updated navigation bar.
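Read concretely, selecting a subunit only refreshes the content area, while selecting a unit rebuilds the subunit regions and then refreshes the content from one of them. The sketch below uses years as units and months as subunits purely as an example; that pairing and all names are assumptions, not part of the described interface.

```swift
import Foundation

/// A minimal model of the unit/subunit navigation bar: here the units are
/// hypothetically years and the subunits months, purely for illustration.
struct NavigationBar {
    private(set) var selectedUnit: Int          // e.g. a year such as 2024
    private(set) var subunits: [Int]            // e.g. months 1...12
    private(set) var selectedSubunit: Int
    private(set) var contentArea: String        // stands in for displayed content

    init(unit: Int) {
        selectedUnit = unit
        subunits = Array(1...12)
        selectedSubunit = 1
        contentArea = NavigationBar.content(unit: unit, subunit: 1)
    }

    /// Selecting a subunit region only refreshes the content area.
    mutating func selectSubunit(_ subunit: Int) {
        selectedSubunit = subunit
        contentArea = NavigationBar.content(unit: selectedUnit, subunit: subunit)
    }

    /// Selecting a unit region rebuilds the subunit regions for that unit and
    /// refreshes the content area from one of the new subunits.
    mutating func selectUnit(_ unit: Int) {
        selectedUnit = unit
        subunits = Array(1...12)
        selectedSubunit = subunits[0]
        contentArea = NavigationBar.content(unit: unit, subunit: subunits[0])
    }

    private static func content(unit: Int, subunit: Int) -> String {
        "Events for \(unit)-\(String(format: "%02d", subunit))"
    }
}
```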
Abstract:
Some embodiments provide a mapping application that provides routing information to third-party applications on a device. The mapping application receives route data that includes first and second locations. Based on the route data, the mapping application presents a set of routing applications that can provide navigation information. The mapping application receives a selection of a routing application in the set of routing applications. The mapping application then passes the route data to the selected routing application so that the routing application can provide navigation information.
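A minimal sketch of that hand-off, assuming a hypothetical `RoutingApplication` interface that third-party apps might expose (nothing here reflects an actual API):

```swift
import Foundation

struct RouteData {
    let origin: String        // first location
    let destination: String   // second location
}

/// A hypothetical interface a third-party routing application might expose.
protocol RoutingApplication {
    var name: String { get }
    func canRoute(_ route: RouteData) -> Bool
    func navigate(_ route: RouteData) -> String   // stands in for launching navigation
}

struct TransitApp: RoutingApplication {
    let name = "SampleTransit"
    func canRoute(_ route: RouteData) -> Bool { true }
    func navigate(_ route: RouteData) -> String {
        "Transit directions from \(route.origin) to \(route.destination)"
    }
}

/// The mapping application's role, as described in the abstract: build the set
/// of candidate routing applications for the route, then hand the route data
/// to whichever one the user selects.
struct MappingApplication {
    let installedApps: [any RoutingApplication]

    func routingApplications(for route: RouteData) -> [any RoutingApplication] {
        installedApps.filter { $0.canRoute(route) }
    }

    func handOff(_ route: RouteData, to selected: any RoutingApplication) -> String {
        selected.navigate(route)
    }
}
```

A caller would ask `routingApplications(for:)` for the candidate set, let the user pick one, and then call `handOff(_:to:)` with the same route data.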
Abstract:
A mapping program for execution by at least one processing unit of a device is described. The device includes a touch-sensitive screen and a multi-touch input interface. The program renders and displays a presentation of a map from a particular view of the map. The program generates an instruction to rotate the displayed map in response to a multi-touch input from the multi-touch input interface. To generate a rotating presentation of the map, the program changes the particular view while receiving the multi-touch input and for a duration of time after the multi-touch input has terminated, in order to provide a degree of inertia motion for the rotating presentation of the map.
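The inertia behavior can be pictured as two phases: while the gesture is active, the view angle tracks the input directly; after the fingers lift, the rotation keeps going with a decaying angular velocity. The class below is a sketch under that reading; the decay constant and stop threshold are invented for the example.

```swift
import Foundation

/// A minimal sketch of inertial map rotation: while the two-finger gesture is
/// active the view angle tracks the input directly, and after the gesture ends
/// the rotation continues with an exponentially decaying angular velocity.
final class RotatingMapView {
    private(set) var angle: Double = 0          // current map rotation in radians
    private var angularVelocity: Double = 0     // radians per second

    /// Called for each multi-touch update while the gesture is in progress.
    func gestureChanged(deltaAngle: Double, deltaTime: Double) {
        angle += deltaAngle
        if deltaTime > 0 { angularVelocity = deltaAngle / deltaTime }
    }

    /// Called on every display frame after the gesture has ended; returns
    /// false once the inertial rotation has effectively stopped.
    @discardableResult
    func coast(deltaTime: Double) -> Bool {
        guard abs(angularVelocity) > 0.01 else { angularVelocity = 0; return false }
        angle += angularVelocity * deltaTime
        angularVelocity *= pow(0.05, deltaTime)   // decay to ~5% of the velocity per second
        return true
    }
}
```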