Abstract:
An image processing apparatus can measure the three-dimensional shape of a subject with high accuracy when an image of a projection pattern is captured with favorable contrast. The image processing apparatus includes a matching calculation unit that performs matching calculation between the projection pattern in a captured image and an image of the projection pattern generated by a projection pattern generation unit. An image evaluation unit calculates contrast as an evaluation criterion in the captured image. For each area of the captured image determined, based on the contrast, to be an area in which the shape can be measured with high accuracy, a symbol position calculation unit accurately calculates the position of each of a group of points forming the projection pattern, thereby obtaining a peak luminance position. A three-dimensional shape calculation unit then calculates the three-dimensional shape of the subject.
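As an illustrative sketch (not part of the abstract), the contrast evaluation and sub-pixel peak-luminance localization described above could be implemented as follows. Michelson contrast as the evaluation criterion and parabolic interpolation for the peak are both assumptions, not details taken from the abstract.

```python
import numpy as np

def michelson_contrast(patch):
    """Michelson contrast of an image patch: (Imax - Imin) / (Imax + Imin)."""
    lo, hi = float(patch.min()), float(patch.max())
    return 0.0 if hi + lo == 0 else (hi - lo) / (hi + lo)

def subpixel_peak(profile):
    """Sub-pixel position of the luminance peak in a 1-D profile,
    refined by fitting a parabola through the brightest sample and
    its two neighbours."""
    i = int(np.argmax(profile))
    if i == 0 or i == len(profile) - 1:
        return float(i)  # border peak: no neighbours to fit
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    denom = y0 - 2.0 * y1 + y2
    return float(i) if denom == 0 else i + 0.5 * (y0 - y2) / denom
```

In such a pipeline, `michelson_contrast` would gate which areas are treated as "highly accurate shape measurable", and `subpixel_peak` would run only inside those areas.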
Abstract:
A projection pattern, obtained as an image by compositing a first two-dimensional pattern image and a second two-dimensional pattern image, is projected onto an object, and the object onto which the projection pattern has been projected is captured. Three-dimensional shape information of the object is measured based on the projection pattern and the captured image. When a region formed from one or a plurality of pixels is set as a first unit area, and a region larger than the first unit area is set as a second unit area, each first unit area of the first two-dimensional pattern image is assigned one of a plurality of pixel values, and each second unit area of the second two-dimensional pattern image is assigned one of the plurality of pixel values.
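A minimal sketch of how such a composite pattern might be generated, assuming two gray levels, 1-pixel first unit areas, 4×4-pixel second unit areas, and simple averaging as the compositing rule (all of these are illustrative assumptions):

```python
import numpy as np

def unit_area_pattern(h, w, unit, values, rng):
    """Tile an h x w image with unit x unit areas, each assigned one
    of the given pixel values at random."""
    cells = rng.choice(values, size=((h + unit - 1) // unit,
                                     (w + unit - 1) // unit))
    return np.kron(cells, np.ones((unit, unit), dtype=cells.dtype))[:h, :w]

def composite_projection_pattern(h, w, values=(0, 255),
                                 first_unit=1, second_unit=4, seed=0):
    """Average a fine-grained first pattern and a coarse second pattern
    into a single projection pattern image."""
    rng = np.random.default_rng(seed)
    first = unit_area_pattern(h, w, first_unit, values, rng)
    second = unit_area_pattern(h, w, second_unit, values, rng)
    return ((first.astype(np.int32) + second) // 2).astype(np.uint8)
```

With binary values 0 and 255, the composite holds at most three gray levels (0, 127, 255), so the fine and coarse components remain separable in the captured image.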
Abstract:
Contact detection units are attached to an information transfer device in a predetermined arrangement pattern to detect contact with a human body. Presentation units are attached to the information transfer device in a predetermined arrangement pattern to present a tactile stimulus. A contact detection unit that has detected contact is specified. Based on the arrangement distribution of the specified contact detection unit and a type of information to be presented via a tactile stimulus, a presentation unit to be driven is specified. The specified presentation unit is driven and controlled.
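One hypothetical selection rule in the spirit of this abstract: drive the presentation unit nearest to the centroid of the contacted detection units, shifted by an offset that encodes the type of information. The centroid-plus-offset rule is an illustrative assumption, not the patented method.

```python
def units_to_drive(contact_ids, positions, info_offsets, info_type):
    """Pick the presentation unit closest to the centroid of the
    contacted detection units, shifted by an offset associated with
    the information type (hypothetical rule for illustration).
    `positions` maps a unit id to its (x, y) location on the device."""
    xs = [positions[i][0] for i in contact_ids]
    ys = [positions[i][1] for i in contact_ids]
    dx, dy = info_offsets[info_type]
    tx, ty = sum(xs) / len(xs) + dx, sum(ys) / len(ys) + dy
    return min(positions, key=lambda i: (positions[i][0] - tx) ** 2 +
                                        (positions[i][1] - ty) ** 2)
```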
Abstract:
A robot control apparatus that controls operation of a robot that grips a target object includes an acquisition unit that obtains information about a position and orientation of the target object, a selection unit that selects one of a plurality of gripping modes based on a captured image of the target object, and a generation unit that generates, based on the obtained position and orientation and the selected gripping mode, gripping operation data that defines a gripping operation of the robot.
Abstract:
A robot control apparatus for controlling a robot manipulating a target object includes: a measurement unit configured to measure a change of a gripping unit, configured to grip the target object, when the gripping unit contacts the target object; a first acquisition unit configured to acquire the change of the gripping unit measured by the measurement unit; a second acquisition unit configured to acquire a gripping state, which is a state of gripping of the target object by the gripping unit, based on the change of the gripping unit acquired by the first acquisition unit; and a control unit configured to control an action of the robot based on the gripping state acquired by the second acquisition unit.
Abstract:
A three-dimensional measurement apparatus comprising: a projection unit configured to project a pattern including a measurement pattern and a feature pattern onto an object; an image capturing unit configured to capture an image of the object onto which the pattern is projected; a grouping unit configured to divide the captured image into a plurality of regions using the measurement pattern in the pattern and a plurality of epipolar lines determined based on a positional relationship between the projection unit and the image capturing unit, thereby grouping predetermined regions among the plurality of divided regions; a feature recognition unit configured to recognize the feature pattern based on a difference in the feature pattern between the predetermined regions; and a three-dimensional shape calculation unit configured to calculate a three-dimensional shape of the object based on the feature pattern.
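Assuming a rectified setup in which the epipolar lines are horizontal image rows, the grouping step could be sketched as assigning each divided region to the epipolar band that contains its centroid. The band-assignment rule is an assumption made for illustration:

```python
import numpy as np

def group_regions_by_epipolar_band(centroids, epipolar_ys):
    """Group region indices by the horizontal epipolar band containing
    each region's centroid (rectified images assumed, so epipolar lines
    are rows at the given y coordinates)."""
    epipolar_ys = np.sort(np.asarray(epipolar_ys))
    groups = {}
    for idx, (_, cy) in enumerate(centroids):
        band = int(np.searchsorted(epipolar_ys, cy, side="right"))
        groups.setdefault(band, []).append(idx)
    return groups
```

Regions that share a band can then be compared to recognize the feature pattern from their differences.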
Abstract:
A projection pattern that includes a measurement pattern for measuring a distance to a target object, and a code pattern for identifying the measurement pattern, is projected onto the target object. The target object onto which the projection pattern was projected is sensed. On the basis of a relationship between the measurement pattern and the code pattern in the sensed image, the projection pattern to be projected onto the target object thereafter is changed; the code pattern is read out in the sensed image of the target object onto which the changed projection pattern was projected, and the measurement pattern is associated with the code pattern. Using the associated measurement pattern, a distance from the projection unit or the sensing unit to the target object is acquired.
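Once each measurement stripe is associated with the projector column its code identifies, distance follows from standard projector-camera triangulation, Z = f·B / disparity. A sketch under that assumption (the function and parameter names are illustrative, not taken from the abstract):

```python
def associate_and_triangulate(stripe_xs, codes, focal_px, baseline_mm, proj_xs):
    """Associate each detected measurement stripe (its x position in the
    camera image) with the projector column identified by its code, then
    triangulate depth from the projector-camera disparity:
        Z = f * B / (x_proj - x_cam)."""
    depths = []
    for x_cam, code in zip(stripe_xs, codes):
        x_proj = proj_xs[code]              # column identified by the code
        depths.append(focal_px * baseline_mm / (x_proj - x_cam))
    return depths
```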
Abstract:
An information processing apparatus for generating a projection pattern used in three-dimensional measurement of a target object, comprising: determination means for determining a projection code string for generating the projection pattern based on distance information of the target object obtained in advance; and generation means for generating a pattern image of the projection pattern based on the projection code string determined by the determination means.
Abstract:
An information processing apparatus includes a projection unit configured to project a projection pattern onto an object, an imaging unit configured to capture an image of the object on which the projection pattern is projected, and a derivation unit configured to derive a three-dimensional shape of the object based on the image captured by the imaging unit. The projection pattern projected onto the object by the projection unit includes a first pattern in which a continuous luminance variation is repetitively arranged at certain distances in a predetermined direction, and a second pattern carrying, in areas between peaks of the first pattern, information for identifying the position of the first pattern in the captured image.
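One plausible realization of such a pattern, assuming a cosine luminance profile for the first pattern and bright marks in the troughs between peaks as the identifying second pattern (both choices are assumptions for illustration):

```python
import numpy as np

def striped_pattern_with_codes(h, w, period, code_bits):
    """First pattern: a cosine luminance variation repeating every
    `period` pixels along x (peaks at multiples of `period`).
    Second pattern: a bright mark in the trough between successive
    peaks wherever the corresponding code bit is 1."""
    x = np.arange(w)
    stripe = (127.5 * (1.0 + np.cos(2.0 * np.pi * x / period))).astype(np.uint8)
    img = np.tile(stripe, (h, 1))
    for k, bit in enumerate(code_bits):
        cx = int((k + 0.5) * period)        # midway between peaks k and k+1
        if bit and cx < w:
            img[:, cx] = 255                # identifying mark
    return img
```

Placing the marks at the troughs keeps them from disturbing the peak positions used for the shape measurement itself.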