Abstract:
Provided is an information processing device to be connected to an odor presentation device that presents an odor to a user. The information processing device acquires spatial odor information indicative of an odor in a reproduction target space and, using the acquired spatial odor information, instructs the odor presentation device to present to the user an odor presumed to be perceivable in the reproduction target space.
Abstract:
An information processing apparatus is connected to a first device fixed to the left hand of a user and a second device fixed to the right hand of the user. The apparatus detects the positions of the first device and the second device, and calculates information associated with the sizes of the user's hands on the basis of the difference between the detected positions.
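The size calculation described above can be illustrated with a minimal sketch, assuming each hand-mounted device reports a 3-D position and that the measured separation relates to hand size through a calibration factor. The function names and the `scale` factor are illustrative assumptions, not details from the abstract:

```python
import math

def device_distance(left_pos, right_pos):
    """Euclidean distance between the left- and right-hand devices."""
    return math.sqrt(sum((l - r) ** 2 for l, r in zip(left_pos, right_pos)))

def estimate_hand_size(left_pos, right_pos, scale=0.5):
    # Assumption for illustration: with the hands held in a known pose,
    # the device separation is proportional to hand size; `scale` is a
    # hypothetical calibration factor.
    return device_distance(left_pos, right_pos) * scale
```

For example, devices reporting positions `(0, 0, 0)` and `(3, 4, 0)` are 5.0 units apart.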
Abstract:
A head-mounted display device (30) is provided which includes: optical elements (33L, 33R) disposed in front of the eyes of a user and configured to guide light associated with images to be presented to the user into the user's eyes; and a drive mechanism configured to move the optical elements in directions intersecting the user's line-of-sight directions. An image display system is further provided which includes the head-mounted display device (30) and an image supply device (10), the image supply device including: an image supply part configured to supply the images to the head-mounted display device; and a display position control part configured to output control instructions that operate the drive mechanism so as to change the positions at which the images are displayed within the user's field of view.
Abstract:
Disclosed herein is a scent-emanating apparatus including a discharging mechanism that discharges outward a scent substance perceivable by a user together with a marker substance used for detection, and a sensor that detects the marker substance discharged from the discharging mechanism.
Abstract:
An information processing apparatus is connected to a device fixed to a hand of a user, the device including a finger position sensor configured to detect the fingertip direction of one of the index finger, middle finger, ring finger, and little finger of the user's hand. On the basis of the fingertip direction detected by the finger position sensor and received from the device, the apparatus acquires information regarding the angle between that finger and an adjacent finger, and, on the basis of the result of the acquisition, estimates the angles between the respective index, middle, ring, and little fingers of the user's hand.
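As a rough sketch of the estimation step, assuming the sensor reports each fingertip direction as a 3-D vector and that the inter-finger angle is the angle between adjacent direction vectors (the representation and function names are assumptions for illustration):

```python
import math

def angle_between(d1, d2):
    """Angle in degrees between two fingertip direction vectors."""
    dot = sum(a * b for a, b in zip(d1, d2))
    n1 = math.sqrt(sum(a * a for a in d1))
    n2 = math.sqrt(sum(b * b for b in d2))
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def estimate_inter_finger_angles(directions):
    # directions: fingertip vectors for index, middle, ring, little finger,
    # in that order; returns the three angles between adjacent fingers.
    return [angle_between(a, b) for a, b in zip(directions, directions[1:])]
```

Four fingertip directions thus yield three adjacent-finger angles.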
Abstract:
A display portion 318 includes a plurality of display surfaces corresponding to a plurality of pixels within an image targeted for display, and the display surfaces are configured such that their positions in the direction perpendicular to the display surfaces can be changed. A convex lens 312 presents a virtual image of the image displayed on the display portion 318 within the field of vision of a user. A control portion 10 adjusts the positions of the plurality of display surfaces based on depth information on an object contained in the target image, thereby adjusting the position of the virtual image presented by the convex lens 312 on a per-pixel basis.
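The per-pixel adjustment can be sketched with the thin-lens equation: for a convex lens of focal length f, a display surface placed at distance d = f·D / (f + D) in front of the lens (inside the focal length) produces a virtual image at distance D. The use of this model, along with all names and units below, is an assumption for illustration, not the patent's stated method:

```python
def panel_distance_for_depth(depth_m, focal_len_m):
    """Distance (metres) from the lens at which a display surface must sit
    so that its virtual image appears at `depth_m` in front of the user,
    from the thin-lens relation 1/f = 1/d - 1/depth (virtual image)."""
    return focal_len_m * depth_m / (focal_len_m + depth_m)

def surface_positions(depth_map, focal_len_m):
    # depth_map: per-pixel object depths in metres; returns the per-pixel
    # display-surface positions realising those virtual-image depths.
    return [[panel_distance_for_depth(d, focal_len_m) for d in row]
            for row in depth_map]
```

As the target depth grows, the required surface position approaches the focal length from below, so nearer objects need the surface moved closer to the lens.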
Abstract:
The present invention provides an information processing device capable of properly controlling a smell presentation device in accordance with its state of use. The information processing device acquires information on the state of use of the smell presentation device that presents a smell to a user, and causes the smell presentation device to present a smell in a mode determined in accordance with the acquired information.
Abstract:
A robot 150 is placed on a touch panel. When touch sensor action sections 156a to 156c provided on a bottom face 150b of the robot act on the touch panel, an information processing device shows a face image in a rectangular area 16 of a display screen 14 of the touch panel, the rectangular area 16 corresponding to the bottom face of the robot. The face image is taken in through the bottom face 150b of the robot 150 and appears in a face section 152. Further, the brightness of pixels in areas 18a and 18b of the display screen 14, which correspond to optical sensors 160a and 160b provided on the bottom face 150b of the robot, is changed. The robot 150 detects this change and converts it into a motion of an internal actuator, sound output, light emission, and so on, in accordance with predetermined rules.
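The optical signalling from the panel to the robot can be sketched as follows, assuming the two sensor areas (under 160a and 160b) together encode a two-bit command. The bit-to-action mapping stands in for the abstract's "predetermined rules" and is purely illustrative:

```python
# Assumed mapping from decoded bit patterns to robot behaviours;
# the abstract only states that "predetermined rules" exist.
ACTIONS = {
    (0, 1): "rotate_actuator",
    (1, 0): "output_sound",
    (1, 1): "emit_light",
}

def decode_sensor_bits(brightness_a, brightness_b, threshold=128):
    """Convert raw brightness readings from sensors 160a/160b into bits."""
    return (int(brightness_a > threshold), int(brightness_b > threshold))

def robot_react(brightness_a, brightness_b):
    # Decode the panel's brightness pattern and look up the behaviour.
    bits = decode_sensor_bits(brightness_a, brightness_b)
    return ACTIONS.get(bits, "idle")
```

For instance, brightening only area 18a above the threshold decodes to `(1, 0)`, which this hypothetical table maps to a sound output.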