Abstract:
Applications may be tagged with location data when they are used. Mobile devices may anonymously submit application usage data. Aggregated application usage data from many mobile devices may be analyzed to determine applications that are particularly relevant to a given location (i.e., exhibiting a high degree of localization). Analysis may include determining the application usage intensity, whether hotspots exist at a given location, the spatial entropy of a particular application, the device populations in a particular area, etc. Based on the localized application analysis, applications may be ranked according to local relevance, and, based on this ranking, application recommendations may be provided to a user.
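The spatial-entropy ranking described above can be sketched as follows. This is a minimal illustration, not the patented method: the cell granularity, the use of Shannon entropy over location cells, and all names are assumptions, and usage counts are assumed to be positive.

```python
import math

def spatial_entropy(usage_by_cell):
    """Shannon entropy (bits) of an app's usage distribution over location
    cells. Low entropy means usage is concentrated in few cells, i.e. the
    app is highly localized. usage_by_cell: dict of cell id -> positive count."""
    total = sum(usage_by_cell.values())
    entropy = 0.0
    for count in usage_by_cell.values():
        p = count / total
        entropy -= p * math.log2(p)
    return entropy

def rank_by_localization(apps):
    """Rank apps most-localized first, i.e. by ascending spatial entropy.

    apps: dict of app name -> usage_by_cell dict."""
    return sorted(apps, key=lambda name: spatial_entropy(apps[name]))
```

For example, a transit app used almost entirely near one station would sort ahead of a chat app whose usage is spread evenly across the city, and so would be recommended to users in that area.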
Abstract:
Techniques for mobile devices to subscribe and share raw sensor data are provided. The raw sensor data associated with sensors (e.g., accelerometers, gyroscopes, compasses, pedometers, pressure sensors, audio sensors, light sensors, barometers) of a mobile device can be used to determine the movement or activity of a user. By sharing the raw or compressed sensor data with other computing devices, the other computing devices can determine a motion state based on the sensor data. Additionally, in some instances, the other computing devices can determine a functional state based on the sensor data and the motion state. For example, functional state classification can be associated with each motion state (e.g., driving, walking) by further describing each motion state (e.g., walking on rough terrain, driving while texting).
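A motion-state determination from shared raw accelerometer data might look like the toy classifier below. This is a sketch under stated assumptions only: the variance features, the thresholds, and the three-way labeling are illustrative and not taken from the abstract, which leaves the classification method unspecified.

```python
import math

def classify_motion_state(samples, still_thresh=0.05, walk_thresh=1.0):
    """Toy motion-state classifier over raw accelerometer samples.

    samples: list of (x, y, z) readings in g-units. Computes the variance of
    the acceleration magnitude; thresholds (in g^2) are purely illustrative."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < still_thresh:
        return "stationary"   # near-constant magnitude
    if var < walk_thresh:
        return "walking"      # moderate periodic variation
    return "driving"          # high-variance catch-all in this sketch
```

A receiving device could then attach a functional-state label (e.g., "walking on rough terrain") by running further classifiers conditioned on the returned motion state.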
Abstract:
Methods and mobile devices determine an exit from a vehicle. Sensors of a mobile device can be used to determine when the user is in a vehicle that is driving. The same or different sensors can be used to identify a disturbance (e.g., loss of communication connection from mobile device to a car computer). After the disturbance, an exit confidence score can be determined at various times, and compared to a threshold. The exit of the user can be determined based on the comparison of the exit confidence score to the threshold. The mobile device can perform one or more functions in response to the exit confidence score exceeding the threshold, such as changing a user interface (e.g., of a navigation app) or obtaining a location to designate a parking location.
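One simple way to realize the score-and-threshold step is a weighted combination of disturbance signals, as sketched below. The signal names, weights, and threshold value are hypothetical; the abstract does not specify how the confidence score is computed.

```python
def exit_confidence(signals, weights=None):
    """Combine disturbance indicators into an exit confidence score in [0, 1].

    signals: dict mapping an indicator name (e.g. "bluetooth_lost",
    "walking_detected") to a value in [0, 1]. Weights default to uniform;
    both the indicators and weights are illustrative."""
    if weights is None:
        weights = {name: 1.0 for name in signals}
    total_weight = sum(weights[name] for name in signals)
    return sum(signals[name] * weights[name] for name in signals) / total_weight

def detect_exit(signals, threshold=0.7):
    """Declare an exit when the confidence score meets the threshold."""
    return exit_confidence(signals) >= threshold
```

Once `detect_exit` fires, the device could, for instance, drop a pin at its current location as the parking spot.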
Abstract:
A device provides user interfaces for capturing and sending media, such as audio, video, or images, from within a message application. The device detects a movement of the device and in response, plays or records an audio message. The device detects a movement of the device and in response, sends a recorded audio message. The device removes messages from a conversation based on expiration criteria. The device shares a location with one or more message participants in a conversation.
Abstract:
In some implementations, a mobile device can be configured to provide navigation instructions to a user of the mobile device. The navigation instructions can be graphical, textual or audio instructions. The presentation of the navigation instructions can be dynamically adjusted based on the importance of individual instructions and/or environmental conditions. For example, each navigation instruction can be associated with an importance value indicating how important the instruction is. The volume of important audio instructions can be adjusted (e.g., increased) to compensate for ambient noise so that a user will be more likely to hear the navigation instruction. The timing and/or repetition of the presentation of important instructions can be adjusted based on weather conditions, traffic conditions, or road conditions and/or road features so that a user will be less likely to miss an important navigation instruction.
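The noise-compensation idea can be sketched as a small gain function. The quiet-room reference level, the gain constant, and the linear scaling by importance are all assumptions made for illustration; the abstract does not give a formula.

```python
def adjusted_volume(base_volume, ambient_db, importance,
                    quiet_db=40.0, gain_per_db=0.02):
    """Boost an audio instruction's volume above its base level to compensate
    for ambient noise. importance in [0, 1] scales the compensation, so only
    important instructions get a large boost. All constants are illustrative.
    Returns a volume clamped to [0, 1]."""
    noise_excess = max(0.0, ambient_db - quiet_db)   # dB above a quiet room
    boost = noise_excess * gain_per_db * importance
    return min(1.0, base_volume + boost)
```

Under this sketch, a critical "turn now" instruction in 70 dB traffic noise plays at full volume, while a low-importance instruction in a quiet car keeps its base volume.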
Abstract:
The embodiments described relate to techniques and systems for utilizing a portable electronic device to monitor, process, present and manage data captured by a series of sensors and location awareness technologies to provide a context aware map and navigation application. The context aware map application offers a user interface including visual and audio input and output, and provides several map modes that can change based upon context determined by data captured by a series of sensors and location awareness technologies.
Abstract:
A first mobile device may determine a set of range values between the first mobile device and a second mobile device based on a difference between a first time and a second time for ranging wireless signals. The mobile device can determine first odometry information using a first sensor on the first mobile device, the first odometry information indicating a first motion of the first mobile device during a time period. The mobile device can receive, via a data channel, second odometry information determined from second measurements captured using a second sensor on the second mobile device. The mobile device can solve for an angle between a first reference frame for the first device and a second reference frame for the second device using the set of range values and the first and second odometry information, to display a directional arrow pointing to a current position of the second mobile device.
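A brute-force 2D version of solving for the inter-frame angle can be sketched as below: search over the rotation between the two odometry frames (and the initial bearing of the second device) for the pair that best explains the measured ranges. This is a minimal sketch, not the patented solver; the grid search, the planar model, and the parameterization are all assumptions.

```python
import math

def rotate(v, theta):
    """Rotate 2D vector v by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def solve_frame_angle(ranges, a_disp, b_disp, steps=72):
    """Grid-search the rotation theta between device B's odometry frame and
    device A's frame, plus B's initial bearing phi from A, minimizing squared
    range residuals. ranges[i]: measured range at sample i; a_disp[i]/b_disp[i]:
    each device's displacement from its own start, in its own frame.
    Returns (theta, phi). Illustrative only; a real solver would use
    non-linear least squares."""
    best = None
    r0 = ranges[0]
    for i in range(steps):
        theta = 2 * math.pi * i / steps
        for j in range(steps):
            phi = 2 * math.pi * j / steps
            b0 = (r0 * math.cos(phi), r0 * math.sin(phi))  # B's start in A's frame
            err = 0.0
            for r, a, d in zip(ranges, a_disp, b_disp):
                bd = rotate(d, theta)                       # B's motion in A's frame
                dx = b0[0] + bd[0] - a[0]
                dy = b0[1] + bd[1] - a[1]
                err += (math.hypot(dx, dy) - r) ** 2
            if best is None or err < best[0]:
                best = (err, theta, phi)
    return best[1], best[2]
```

With theta and phi in hand, the second device's current position in the first device's frame follows directly, which is what the directional arrow points at. Note that range-only data leaves a mirror ambiguity, so the recovered parameters are one of two geometrically consistent solutions.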
Abstract:
Embodiments are disclosed for head tracking state detection based on correlated motion of a source device and a headset communicatively coupled to the source device. In an embodiment, a method comprises: obtaining, using one or more processors of a source device, source device motion data from a source device and headset motion data from a headset; determining, using the one or more processors, correlation measures using the source device motion data and the headset motion data; updating, using the one or more processors, a motion tracking state based on the determined correlation measures; and initiating head pose tracking in accordance with the updated motion tracking state.
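A plain Pearson correlation over the two devices' motion traces is one way to realize the "correlation measures" step, sketched below. The function names, the single-axis traces, the threshold, and the two-state output are assumptions for illustration; the abstract does not specify the measure or the state machine.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length motion traces
    (e.g., per-sample acceleration magnitudes). Assumes non-constant traces."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def update_tracking_state(source_trace, headset_trace, threshold=0.8):
    """If source device and headset move together (high correlation), the
    source motion is treated as whole-body motion; otherwise the headset is
    moving independently and head pose tracking should run. Threshold is
    illustrative."""
    r = pearson(source_trace, headset_trace)
    return "correlated" if r >= threshold else "independent"
```

For example, a phone and headset bouncing together in a user's pocket and on their head while walking would read as "correlated", whereas a head turn with the phone at rest would read as "independent" and initiate head pose tracking.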