Abstract:
In some implementations, a mobile device can be configured to provide simplified audio navigation instructions. The simplified audio navigation instructions provide a reduced set of audio navigation instructions so that audio instructions are presented only when the user wishes or needs to hear them. The simplified audio navigation instructions can be enabled by the user or automatically, and can be configured with rules for when to present audio navigation instructions. For example, the rules can specify that audio navigation instructions are to be provided for complex road segments, a user-defined portion of a route, or specified road types, among other criteria. The mobile device can also be configured with exceptions to the rules so that audio navigation instructions are presented when the user has, for example, deviated from a defined route.
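The rule-and-exception behavior described above might be sketched roughly as follows. This is a hypothetical Python illustration only; the type names (RouteSegment, AudioRule), their fields, and the off_route exception check are assumptions chosen for demonstration, not details from the abstract.

```python
# Hypothetical sketch of rule-based filtering for simplified audio navigation.
# Names such as RouteSegment and AudioRule are illustrative, not from the abstract.
from dataclasses import dataclass, field

@dataclass
class RouteSegment:
    road_type: str          # e.g., "highway", "residential"
    is_complex: bool        # e.g., multi-lane merge or quick successive turns
    user_selected: bool     # falls within a user-defined portion of the route

@dataclass
class AudioRule:
    allowed_road_types: set = field(default_factory=lambda: {"residential"})
    announce_complex: bool = True
    announce_user_selected: bool = True

def should_announce(segment: RouteSegment, rule: AudioRule, off_route: bool) -> bool:
    """Return True if an audio instruction should be spoken for this segment."""
    if off_route:
        # Exception to the rules: always speak when the user has deviated from the route.
        return True
    if rule.announce_complex and segment.is_complex:
        return True
    if rule.announce_user_selected and segment.user_selected:
        return True
    return segment.road_type in rule.allowed_road_types

# Example: a complex highway interchange triggers audio even though highways are otherwise filtered out.
print(should_announce(RouteSegment("highway", True, False), AudioRule(), off_route=False))  # True
```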
Abstract:
Techniques for mobile devices to subscribe to and share raw sensor data are provided. The raw sensor data associated with sensors (e.g., accelerometers, gyroscopes, compasses, pedometers, pressure sensors, audio sensors, light sensors, barometers) of a mobile device can be used to determine the movement or activity of a user. By sharing the raw or compressed sensor data, a mobile device enables other computing devices to determine a motion state based on the sensor data. Additionally, in some instances, the other computing devices can determine a functional state based on the sensor data and the motion state. For example, a functional state classification can be associated with each motion state (e.g., driving, walking) by further describing that motion state (e.g., walking on rough terrain, driving while texting).
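As a rough illustration of how a subscribing device might turn shared sensor data into a motion state and then a functional state, here is a hypothetical Python sketch; the variance and spread thresholds and the state labels are assumptions for demonstration only, not values from the abstract.

```python
# Illustrative sketch of a subscriber that classifies motion and functional state
# from shared accelerometer samples. Thresholds and labels are assumptions.
import statistics

def classify_motion_state(accel_magnitudes):
    """Coarse motion state from the variance of acceleration magnitude (in g)."""
    variance = statistics.pvariance(accel_magnitudes)
    if variance < 0.01:
        return "stationary"
    elif variance < 0.5:
        return "walking"
    return "driving"

def classify_functional_state(motion_state, accel_magnitudes):
    """Refine the motion state into a functional state (e.g., walking on rough terrain)."""
    spread = max(accel_magnitudes) - min(accel_magnitudes)
    if motion_state == "walking" and spread > 1.5:
        return "walking on rough terrain"
    if motion_state == "driving" and spread > 0.8:
        return "driving on bumpy road"
    return motion_state

# A subscriber receives shared (possibly compressed) samples and classifies them.
shared_samples = [1.0, 1.2, 0.8, 1.3, 0.9, 1.1]   # acceleration magnitudes in g
motion = classify_motion_state(shared_samples)
print(motion, "->", classify_functional_state(motion, shared_samples))
```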
Abstract:
A wearable computing device can detect device-raising gestures. For example, onboard motion sensors of the device can detect movement of the device in real time and infer information about the spatial orientation of the device. Based on analysis of signals from the motion sensors, the device can detect a raise gesture, which can be a motion pattern consistent with the user moving the device's display into his line of sight. In response to detecting a raise gesture, the device can activate its display and/or other components. Detection of a raise gesture can occur in stages, and activation of different components can occur at different stages.
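A staged detector of this kind might be organized along the following lines. This Python sketch is purely illustrative: the stage names, tilt thresholds, and the components activated at each stage are assumptions, not the device's actual algorithm.

```python
# A minimal, hypothetical sketch of staged raise-gesture detection from
# tilt-angle samples derived from motion sensors. Stages and thresholds are illustrative.
POSSIBLE, DETECTED, CONFIRMED = "possible", "detected", "confirmed"

def detect_raise_gesture(tilt_samples):
    """Walk tilt-angle samples (degrees from horizontal) through detection stages.

    Yields (stage, action) pairs so that different components can be activated
    at different stages, e.g., waking the display only once the gesture is confirmed.
    """
    stage = None
    dwell = 0
    for tilt in tilt_samples:
        if stage is None and tilt > 20:
            stage = POSSIBLE
            yield stage, "prime motion coprocessor"
        elif stage == POSSIBLE and tilt > 45:
            stage = DETECTED
            yield stage, "activate application processor"
        elif stage == DETECTED:
            dwell = dwell + 1 if tilt > 45 else 0
            if dwell >= 3:                      # orientation held consistent with line of sight
                stage = CONFIRMED
                yield stage, "activate display"
                return

for stage, action in detect_raise_gesture([5, 25, 50, 55, 60, 62]):
    print(stage, "->", action)
```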
Abstract:
A real-time calibration system and method for a mobile device having an onboard magnetometer uses an estimator to estimate magnetometer calibration parameters and a magnetic field external to the mobile device (e.g., the Earth's magnetic field). The calibration parameters can be used to calibrate uncalibrated magnetometer readings output from the onboard magnetometer. The external magnetic field can be modeled as a weighted combination of a past estimate of the external magnetic field and the asymptotic mean of that magnetic field, perturbed by random noise (e.g., Gaussian random noise). The weight can be adjusted based on a measure of the statistical uncertainty of the estimated calibration parameters and the estimated external magnetic field. The asymptotic mean of the external magnetic field can be modeled as a time average of the estimated external magnetic field.
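One plausible way to write the field model described above, with notation chosen here for illustration (the abstract gives no explicit symbols): let b_k denote the estimated external field at step k, \bar{b}_k its asymptotic mean, \alpha_k the uncertainty-dependent weight, and w_k the random perturbation.

```latex
% Hypothetical notation; the abstract does not specify symbols or a filter form.
\[
\begin{aligned}
  b_{k+1} &= \alpha_k\, b_k + (1 - \alpha_k)\, \bar{b}_k + w_k,
            \qquad w_k \sim \mathcal{N}(0, Q_k), \\
  \bar{b}_k &= \frac{1}{k+1} \sum_{j=0}^{k} b_j .
\end{aligned}
\]
```

Here \alpha_k would be adjusted from the measure of statistical uncertainty mentioned above, and the second line expresses the asymptotic mean as a running time average of the field estimates.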
Abstract:
Methods, program products, and systems for gesture classification and recognition are disclosed. In general, in one aspect, a system can determine multiple motion patterns for the same user action (e.g., picking up a mobile device from a table) from empirical training data. The system can collect the training data from one or more mobile devices. The training data can include multiple series of motion sensor readings for a specified gesture. Each series of motion sensor readings can correspond to a particular way a user performs the gesture. Using clustering techniques, the system can extract one or more motion patterns from the training data. The system can send the motion patterns to mobile devices as prototypes for gesture recognition.
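As a toy illustration of the prototype-extraction step, the following Python sketch clusters fixed-length motion-sensor series with a plain k-means pass and returns one centroid per cluster as a prototype; the clustering method, series representation, and training data are assumptions for demonstration, not the system's actual technique.

```python
# Simplified, illustrative extraction of motion-pattern prototypes from training data.
import random

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def extract_prototypes(series_list, k=2, iterations=10, seed=0):
    """Cluster motion-sensor series and return one prototype (centroid) per cluster."""
    random.seed(seed)
    prototypes = random.sample(series_list, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for series in series_list:
            nearest = min(range(k), key=lambda i: distance(series, prototypes[i]))
            clusters[nearest].append(series)
        prototypes = [
            [sum(vals) / len(vals) for vals in zip(*cluster)] if cluster else prototypes[i]
            for i, cluster in enumerate(clusters)
        ]
    return prototypes

# Two different ways of "picking up the device from a table", as toy magnitude series.
training = [[0.1, 0.5, 1.2, 0.9], [0.1, 0.6, 1.1, 0.8],
            [0.9, 1.2, 0.5, 0.1], [0.8, 1.1, 0.6, 0.2]]
for proto in extract_prototypes(training):
    print(proto)   # prototypes to send to devices for on-device gesture matching
```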
Abstract:
Apps may be tagged with location data when they are used. Mobile devices may anonymously submit app usage data. Aggregated app usage data from many mobile devices may be analyzed to determine apps that are particularly relevant to a given location (i.e., exhibiting a high degree of localization). Analysis may include determining the app usage intensity, whether hotspots exist at a given location, the spatial entropy of a particular app, the device population in a particular area, etc. Based on the localized app analysis, apps may be ranked according to local relevance, and, based on this ranking, app recommendations may be provided to a user.
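A localized-relevance ranking of this general shape could be sketched as follows in Python. The grid-cell representation, the spatial-entropy measure, and the intensity/(1 + entropy) score are illustrative assumptions, not the analysis actually used.

```python
# Illustrative sketch of ranking apps by local relevance at one location using
# usage intensity and spatial entropy. Field names and the score are assumptions.
import math
from collections import Counter, defaultdict

# Anonymized usage records: (app_id, grid_cell) pairs aggregated from many devices.
usage = [("transit", "cell_A")] * 40 + [("transit", "cell_B")] * 2 + \
        [("weather", "cell_A")] * 10 + [("weather", "cell_B")] * 10 + [("weather", "cell_C")] * 10

def spatial_entropy(cells):
    """Low entropy means usage is concentrated in few cells (highly localized)."""
    counts = Counter(cells)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

by_app = defaultdict(list)
for app, cell in usage:
    by_app[app].append(cell)

def local_relevance(app, here="cell_A"):
    intensity_here = by_app[app].count(here)
    return intensity_here / (1.0 + spatial_entropy(by_app[app]))

# Apps ranked for recommendation at cell_A: the localized app ranks above the ubiquitous one.
for app in sorted(by_app, key=local_relevance, reverse=True):
    print(app, round(local_relevance(app), 2))
```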
Abstract:
In an example method, a mobile device obtains a signal indicating an acceleration measured by a sensor over a time period. The mobile device determines an impact experienced by a user of the device based on the signal. The mobile device also determines, based on the signal, one or more first motion characteristics of the user during a time prior to the impact, and one or more second motion characteristics of the user during a time after the impact. The mobile device determines that the user has fallen based on the impact, the one or more first motion characteristics of the user, and the one or more second motion characteristics of the user, and in response, generates a notification indicating that the user has fallen.
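The described combination of impact detection with pre- and post-impact motion characteristics might look roughly like this Python sketch; the thresholds, window sizes, and single-axis magnitude signal are assumptions chosen for illustration, not values from the abstract.

```python
# Minimal sketch of the fall-detection logic described above, using a single
# acceleration-magnitude signal. Thresholds and window sizes are illustrative.

def detect_fall(accel, sample_rate_hz=50, impact_threshold_g=3.0):
    """Return True if an impact is preceded by activity and followed by stillness."""
    try:
        impact_idx = next(i for i, a in enumerate(accel) if a >= impact_threshold_g)
    except StopIteration:
        return False                                   # no impact observed

    window = sample_rate_hz                            # one second before / after the impact
    before = accel[max(0, impact_idx - window):impact_idx]
    after = accel[impact_idx + 1:impact_idx + 1 + window]
    if not before or not after:
        return False

    # First motion characteristic: the user was moving before the impact.
    moving_before = (max(before) - min(before)) > 0.3
    # Second motion characteristic: the user is roughly still (near 1 g) afterward.
    still_after = (max(after) - min(after)) < 0.2

    if moving_before and still_after:
        # In a full system this is where the "user has fallen" notification is generated.
        return True
    return False

# Toy signal: walking (varying around 1 g), a 4 g spike, then lying still.
signal = [1.0, 1.3, 0.8, 1.2] * 15 + [4.0] + [1.02, 0.98] * 30
print(detect_fall(signal))   # True
```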