Abstract:
Wireless power transfer for integrated cycle drive systems is described. A cycle power system includes a rim that is connected to, and positioned concentrically with, a sealed housing that can rotate about an axis. The cycle power system also includes an integrated drive system disposed within the housing. The integrated drive system includes a battery and a motor for driving a cycle by causing rotational movement of the rim about the axis. Additionally, the cycle power system includes an inductive structure that is disposed within the housing, and that wirelessly charges the battery through induction between the inductive structure and a remote charging station.
Abstract:
Systems and methods are provided for alternating projection of content and capturing an image. The method includes steps of projecting, by a projection device, content at a first rate, projecting, by the projection device, a capture frame at a second rate, and capturing, by an image capture device, an image including the capture frame at the second rate, wherein capturing the image comprises capturing the image when the capture frame is projected. Systems and methods provided herein may provide increased tracking accuracy by projecting a capture frame that does not obscure features in the image.
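A minimal sketch in Python of the alternating loop described above, assuming hypothetical Projector and Camera stand-ins (none of these names come from the disclosure): content frames are projected at a first rate, every Nth slot a capture frame is projected instead, and the camera is triggered only while the capture frame is displayed.

```python
import time


class Projector:
    """Hypothetical stand-in for the projection device."""
    def show_content(self, frame):
        pass  # drive the projector with a content frame

    def show_capture_frame(self):
        pass  # project a dedicated capture frame that does not obscure scene features


class Camera:
    """Hypothetical stand-in for the image capture device."""
    def capture(self):
        return "image"  # grab an image of the scene


def run(projector, camera, content_frames, content_rate_hz=60, capture_every_n=6):
    """Project content at the first rate; on every Nth slot (the second rate),
    project a capture frame and capture an image synchronized with it."""
    period = 1.0 / content_rate_hz
    images = []
    for i, frame in enumerate(content_frames):
        if i % capture_every_n == 0:
            projector.show_capture_frame()
            images.append(camera.capture())  # capture only while the capture frame is shown
        else:
            projector.show_content(frame)
        time.sleep(period)
    return images
```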
Abstract:
A method, computer program product, and apparatus for providing interactions of tangible and augmented reality objects are disclosed. In one embodiment, a method of controlling a real object using a device having a camera comprises receiving a selection of at least one object, tracking the at least one object in a plurality of images captured by the camera, and causing control signals to be transmitted from the device to the real object via a machine interface based at least in part on the tracking.
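A minimal Python sketch of the select-track-control loop, assuming hypothetical Tracker and Transmitter stand-ins; the class and function names are illustrative, not taken from the disclosure.

```python
class Tracker:
    """Hypothetical stand-in for the vision-based object tracker."""
    def track(self, frame, target_id):
        # Return the target's (x, y) position in the frame, or None if not found.
        return (0.5, 0.5)


class Transmitter:
    """Hypothetical stand-in for the machine interface to the real object."""
    def send(self, device_id, command):
        print(f"to {device_id}: {command}")  # e.g., a wireless control link


def derive_command(position):
    # Illustrative mapping from a tracked position to a control signal.
    x, y = position
    return {"move_x": x, "move_y": y}


def control_loop(frames, selected_object_id, tracker, transmitter):
    """Track the selected object across captured frames and transmit control
    signals to the real object based on the tracking result."""
    for frame in frames:
        position = tracker.track(frame, selected_object_id)
        if position is None:
            continue  # object not visible in this frame
        transmitter.send(selected_object_id, derive_command(position))
```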
Abstract:
Techniques and apparatus are described for obtaining user input via a stylus configured to serve as an input interface for a computing device. The computing device may obtain rotation-related information indicative of rotational position or rotational movement of the stylus about a longitudinal axis of the stylus. The computing device may identify an operation in response to the rotation-related information, and perform the identified operation.
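A minimal Python sketch of mapping stylus rotation about its longitudinal axis to an operation; the rotation threshold and the operation names are illustrative assumptions.

```python
def handle_rotation(rotation_delta_degrees):
    """Identify an operation from rotational movement of the stylus about its
    longitudinal axis and perform it."""
    if abs(rotation_delta_degrees) < 5:
        return None  # ignore small, likely unintentional rotation
    if rotation_delta_degrees > 0:
        return perform("increase_brush_size")   # assumed example operation
    return perform("decrease_brush_size")       # assumed example operation


def perform(operation_name):
    print(f"performing: {operation_name}")
    return operation_name
```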
Abstract:
A master device images an object device and uses the image to identify the object device. The master device then automatically interfaces with the identified object device, for example, by pairing with the object device. The master device interfaces with a second object device and initiates an interface between the first object device and the second object device. The master device may receive broadcast data from the object device including information about the visual appearance of the object device and use the broadcast data in the identification of the object device. The master device may retrieve data related to the object device and display the related data, for example, over the displayed image of the object device. The master device may provide an interface to control the object device or be used to pass data to the object device.
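A minimal Python sketch of the identify-then-interface flow, assuming hypothetical helpers: identify_from_image matches the captured image (optionally aided by broadcast appearance data) against known devices, and MasterDevice.pair stands in for whatever pairing mechanism is used.

```python
class MasterDevice:
    """Hypothetical stand-in for the master device."""
    def pair(self, device):
        print(f"paired with {device['name']}")      # e.g., a wireless pairing

    def send(self, device, message):
        print(f"to {device['name']}: {message}")


def identify_from_image(image, broadcast_data, known_devices):
    """Trivial lookup standing in for visual identification of the object device."""
    appearance = broadcast_data.get("appearance") if broadcast_data else None
    for device in known_devices:
        if appearance is not None and device.get("appearance") == appearance:
            return device
    return None


def interface_with(master, image, broadcast_data, known_devices):
    device = identify_from_image(image, broadcast_data, known_devices)
    if device is not None:
        master.pair(device)
    return device


def bridge(master, first_device, second_device):
    # The master initiates an interface between the two object devices.
    master.send(first_device, {"connect_to": second_device["address"]})
```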
Abstract:
Methods and systems according to one or more embodiments are provided for enhancing interactive inputs. In an embodiment, a method includes concurrently capturing touch input data on a screen of a user device and non-touch gesture input data off the screen of the user device. The method also includes determining an input command based at least in part on a combination of the concurrently captured touch input data and the non-touch gesture input data. The method further includes affecting an operation of the user device based on the determined input command.
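A minimal Python sketch of fusing the concurrently captured inputs into one command; the gesture labels and the resulting commands are illustrative assumptions.

```python
def determine_command(touch_event, gesture_event):
    """Fuse an on-screen touch event and a concurrent off-screen gesture into
    a single input command that then affects an operation of the device."""
    if touch_event is None or gesture_event is None:
        return None  # the combination requires both inputs concurrently
    if touch_event["type"] == "hold" and gesture_event["type"] == "swipe_up":
        return "send_selected_item"
    if touch_event["type"] == "tap" and gesture_event["type"] == "rotate":
        return "rotate_selected_item"
    return "default_touch_action"


# Example: a hold on the screen combined with an off-screen upward swipe.
print(determine_command({"type": "hold"}, {"type": "swipe_up"}))  # -> "send_selected_item"
```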
Abstract:
An augmented reality device provides a virtual mask that surrounds the viewer and includes a variation that provides information about the direction to a target item. The variation, which may be a variation in transparency, color, geometric shape, texture, material, lighting, or shading, is associated with the position of the target item so that the orientation of the variation in the virtual mask does not change with respect to the direction of the target item. A portion of the virtual mask that is in the direction that the viewer is facing is displayed over the real-world image with the variation in the virtual mask providing information to the viewer about the direction of the target item. When the viewer rotates with respect to the target item, a different portion of the virtual mask that is in the current field of view is displayed.
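A minimal Python sketch of rendering the visible slice of a direction-anchored mask, using a transparency variation as the example; the field of view, column count, and gradient are illustrative assumptions.

```python
def mask_alpha(bearing_to_target_deg, view_direction_deg, fov_deg=60, columns=120):
    """Per-column opacity for the slice of a 360-degree virtual mask that
    falls within the current field of view. The most transparent point stays
    aligned with the target, so the variation does not move when the viewer
    rotates; only a different portion of the mask becomes visible."""
    alphas = []
    for col in range(columns):
        # World-space direction covered by this mask column.
        direction = view_direction_deg - fov_deg / 2 + fov_deg * col / columns
        # Signed angular distance from this direction to the target.
        diff = (direction - bearing_to_target_deg + 180) % 360 - 180
        alphas.append(min(1.0, abs(diff) / 180.0))  # opacity grows away from the target
    return alphas


# Example: target due north (0 deg), viewer facing east (90 deg); the left edge
# of the visible slice is more transparent, hinting the target is to the left.
print(mask_alpha(0.0, 90.0)[:3], mask_alpha(0.0, 90.0)[-3:])
```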
Abstract:
In an embodiment, a user equipment (UE) groups a plurality of images. The UE displays a first image among the plurality of images, determines an object of interest within the first image and a desired level of zoom, and determines to lock onto the object of interest in association with one or more transitions between the plurality of images. The UE determines to transition to a second image among the plurality of images, and detects, based on the lock determination, the object of interest within the second image. The UE displays the second image by zooming in on the object of interest at a level of zoom that corresponds to the desired level of zoom.
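A minimal Python sketch of the locked transition between grouped images, assuming a hypothetical detect callable that locates the locked object in the next image and returns its bounding box.

```python
def transition(second_image, locked_object, desired_zoom, detect):
    """Display the next image zoomed in on the locked object of interest,
    falling back to the full image if the object is not detected."""
    region = detect(second_image, locked_object)  # (x, y, w, h) or None
    if region is None:
        return {"image": second_image, "zoom": 1.0}
    x, y, w, h = region
    center = (x + w / 2, y + h / 2)               # center the zoom on the object
    return {"image": second_image, "center": center, "zoom": desired_zoom}


# Example with a stub detector that always finds the object at a fixed box.
result = transition("IMG_0002", "face_of_person_a", 2.0,
                    detect=lambda img, obj: (100, 80, 40, 40))
print(result)  # -> zoom 2.0 centered at (120.0, 100.0)
```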
Abstract:
Aspects of the disclosure are related to a method for determining a touch pressure level on a touchscreen, comprising: detecting a touch event by the touchscreen; obtaining data relating to features associated with the touch event comprising a capacitance value, a touch area, and/or a touch duration; and determining a touch pressure level based on one or more of the features.
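A minimal Python sketch of combining the listed features into a discrete pressure level; the feature weights and level boundaries are illustrative assumptions, not values from the disclosure, and the inputs are assumed to be normalized to [0, 1].

```python
def pressure_level(capacitance, touch_area, duration):
    """Map normalized capacitance, touch area, and touch duration to a
    pressure level (1 = light, 2 = medium, 3 = firm)."""
    score = 0.5 * capacitance + 0.3 * touch_area + 0.2 * duration  # assumed weights
    if score < 0.33:
        return 1
    if score < 0.66:
        return 2
    return 3


# Example: a large, long, high-capacitance touch reads as a firm press.
print(pressure_level(0.9, 0.8, 0.7))  # -> 3
```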
Abstract:
Methods, devices, and computer program products for using touch orientation to distinguish between users are disclosed herein. In one aspect, a method of identifying a user of a touch device from a plurality of users of the touch device is described. The method includes receiving touch data from a touch panel of the touch device, the touch data indicating a user's touch on the touch panel. The method further includes determining an orientation of the user's touch based on the received touch data. Finally, the method includes identifying which user of the plurality of users touched the device, based at least in part on the orientation of the touch.
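A minimal Python sketch of identifying the touching user from touch orientation, assuming each user has a characteristic approach angle (for example, users seated on different sides of a shared tabletop display); the angles and user labels are illustrative.

```python
def identify_user(touch_orientation_deg, user_orientations):
    """Return the user whose expected touch orientation is closest to the
    orientation determined from the received touch data."""
    def angular_distance(a, b):
        return abs((a - b + 180) % 360 - 180)

    return min(user_orientations,
               key=lambda user: angular_distance(touch_orientation_deg,
                                                 user_orientations[user]))


# Example: two users seated across from each other at a tabletop device.
users = {"user_a": 0.0, "user_b": 180.0}
print(identify_user(170.0, users))  # -> "user_b"
```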