Abstract:
An image processing apparatus includes a calculation section configured to calculate filtering coefficients of a filter for a first area in an image that is partitioned into multiple first areas including the first area, the image also being partitioned differently into multiple second areas, each of which is covered by several of the first areas, and to calculate a convoluted image of a second area using the filtering coefficients calculated for the first areas covering a part of the second area, the calculation being executed for each of the several first areas covering distinct parts of the second area; and an interpolation section configured to interpolate a pixel in the second area using pixels at the same position in the convoluted images of the second area, which are convoluted with the respective filtering coefficients.
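The mechanism above can be sketched as follows. The abstract does not specify how the per-area coefficients are computed or how the interpolation blends the results, so this sketch assumes simple square kernels and a weighted pixel-wise average of the convolved images; `convolve2d`, `blend_convolutions`, and the blending weights are illustrative names, not the patent's terminology.

```python
import numpy as np

def convolve2d(img, kernel):
    # Naive "same"-size filtering (cross-correlation) with zero padding.
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def blend_convolutions(patch, kernels, weights):
    # Convolve the same second-area patch with the filtering coefficients
    # of each overlapping first area, then interpolate each pixel from the
    # convolved images using normalized blending weights.
    convs = [convolve2d(patch, k) for k in kernels]
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * c for wi, c in zip(w, convs))
```

Blending the convolved images pixel-by-pixel, rather than picking a single first-area filter per pixel, avoids visible seams at the boundaries between first areas.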
Abstract:
A gaze detection apparatus includes a light source which illuminates a user's eye, an imaging unit which generates an image by capturing the user's face, and a processor adapted to: estimate a distance between the imaging unit and the user's face; calculate the ratio of the distance between the imaging unit and the user's face to a separation distance between the imaging unit and the light source and determine, based on the ratio, whether a bright pupil phenomenon occurs; detect, from the captured image, a corneal reflection image of the light source and a center of the user's pupil when determining that the bright pupil phenomenon does not occur; and detect the user's gaze direction or gaze position based on a positional relationship between the center of the user's pupil and the corneal reflection image when determining that the bright pupil phenomenon does not occur.
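The ratio test above can be sketched minimally. The abstract gives no threshold value or units, so the `ratio_threshold` below is a hypothetical placeholder; the physical intuition is that when the face is far from the camera relative to the camera-to-light-source separation, the light source is nearly coaxial with the lens and retinal retro-reflection makes the whole pupil appear bright, masking the corneal reflection.

```python
def bright_pupil_likely(face_distance_mm, separation_mm, ratio_threshold=10.0):
    # Hypothetical decision rule: a large distance-to-separation ratio
    # means the illumination is nearly coaxial with the imaging axis,
    # which is the condition for the bright pupil phenomenon.
    ratio = face_distance_mm / separation_mm
    return ratio > ratio_threshold
```

When this returns `True`, the apparatus would skip corneal-reflection-based gaze detection for that frame rather than risk a false pupil-center estimate.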
Abstract:
An imaging unit has sensitivity to visible light and infrared light and captures an image. A deriving unit calculates the distribution of the color of each pixel in the image captured by the imaging unit and derives a predetermined feature amount indicating the range of the color distribution. An estimating unit estimates the lighting environment during imaging based on the feature amount derived by the deriving unit.
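A minimal sketch of the idea, under assumptions the abstract does not state: the feature amount is taken here as the spread of the per-pixel chromaticity distribution, and the decision threshold is hypothetical. The rationale is that under infrared-dominated illumination a visible+IR sensor records nearly equal R, G, and B responses, so chromaticity collapses to a narrow range around 1/3.

```python
import numpy as np

def color_range_feature(rgb):
    # rgb: H x W x 3 array; chromaticity normalization removes brightness.
    s = rgb.sum(axis=2, keepdims=True) + 1e-9
    chroma = rgb / s
    # Feature amount: spread (range) of the chromaticity distribution.
    return float(chroma.max() - chroma.min())

def estimate_lighting(feature, threshold=0.2):
    # Hypothetical rule: a narrow color distribution suggests
    # infrared-dominated illumination; a wide one suggests visible light.
    return "infrared-dominant" if feature < threshold else "visible-light"
```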
Abstract:
A determining unit determines which one of a plurality of divided areas in a color distribution a value of an input image signal falls within. The plurality of divided areas are defined for each of R, G, and B color components, and the determining unit determines, for each color component, which one of the plurality of divided areas the value of the input image signal falls within. An image correcting unit reads correction coefficients corresponding to the determined divided area for each color component from a correction coefficient storing unit in which correction coefficients are registered for each of the color components and each of the divided areas. The image correcting unit calculates each of R, G, and B output values of an output image signal using the correction coefficients of the individual color components read from the correction coefficient storing unit.
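The lookup-and-correct flow above can be sketched as follows. The abstract does not fix the number of divided areas, the coefficient layout, or the correction formula, so the `[64, 128, 192]`-style boundaries and the per-area `(gain, offset)` pairs below are assumptions for illustration.

```python
def divided_area(value, boundaries):
    # Boundaries such as [64, 128, 192] divide 0..255 into 4 areas.
    for i, b in enumerate(boundaries):
        if value < b:
            return i
    return len(boundaries)

def correct_pixel(rgb, boundaries, coeffs):
    # coeffs[channel][area] = (gain, offset); this layout stands in for
    # the patent's correction coefficient storing unit.
    out = []
    for ch, v in enumerate(rgb):
        gain, offset = coeffs[ch][divided_area(v, boundaries)]
        out.append(gain * v + offset)
    return tuple(out)
```

Registering coefficients per component and per area lets a single table express piecewise corrections such as boosting only bright reds without touching dark reds or the other channels.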
Abstract:
A memory unit stores conversion information which is derived based on a correction coefficient that keeps, within a predetermined allowable range from its minimum value, the sum of a norm of the color difference between a target color, which is the target of correction of a correction target color, and a correction color obtained by correcting the correction target color using the correction coefficient, and a norm of the correction coefficient to which a predetermined weight coefficient is applied. A color correcting unit converts the color of each pixel of an image captured by an image capturing unit based on the conversion information.
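The criterion described above, an error norm plus a weighted norm of the coefficient itself, has the same shape as ridge regression, so a minimal sketch can use its closed form. The abstract does not say the coefficient is a 3x3 matrix or that this solver is used; both are assumptions here.

```python
import numpy as np

def ridge_color_matrix(src, tgt, lam=1e-3):
    # src: N x 3 correction target colors, tgt: N x 3 target colors.
    # Minimizes ||tgt - src @ M||^2 + lam * ||M||^2 in closed form,
    # mirroring the "color-difference norm plus weighted coefficient
    # norm" criterion; lam plays the role of the weight coefficient.
    A = src.T @ src + lam * np.eye(src.shape[1])
    return np.linalg.solve(A, src.T @ tgt)
```

The weight term keeps the correction coefficient small, trading a slightly larger color error for a gentler, more stable conversion across the whole image.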
Abstract:
An image processing apparatus includes: a processor coupled to a memory, configured to: perform an analysis of resolution in at least two directions of an image which is taken of a subject having a radial pattern, and determine, based on a result of the analysis, filter data containing an adjusted weight coefficient which is obtained by adjusting a weight coefficient in whichever of the two directions has lower resolution, the image being corrected by filtering in accordance with a blurring function of the image.
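A minimal sketch of the directional analysis and weight adjustment: the abstract does not define the resolution measure or the adjustment rule, so mean squared finite differences along each axis serve as a crude resolution proxy here, and the `boost` factor is hypothetical.

```python
import numpy as np

def directional_sharpness(img):
    # Mean squared finite difference along each axis as a resolution
    # proxy for the vertical (y) and horizontal (x) directions.
    gy = np.diff(img, axis=0)
    gx = np.diff(img, axis=1)
    return float((gy ** 2).mean()), float((gx ** 2).mean())

def adjust_weights(wy, wx, img, boost=1.5):
    # Hypothetical rule: strengthen the correction weight in whichever
    # direction shows lower resolution (weaker gradients).
    sharp_y, sharp_x = directional_sharpness(img)
    if sharp_y < sharp_x:
        return wy * boost, wx
    return wy, wx * boost
```

A radial test pattern is a natural subject for this analysis because it contains edges at every orientation, so any directional loss of resolution shows up directly.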
Abstract:
An image processing apparatus includes an acquisition unit configured to acquire a first finite spatial filter having image resolution anisotropy, and a calculation unit configured to compute a second spatial filter by convolving a finite filter with the first spatial filter, the finite filter having a sum of elements equal to zero and at least two non-zero elements.
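The kernel-on-kernel convolution above can be sketched directly. A useful property it relies on: the element sum of a convolution equals the product of the two kernels' element sums, so convolving with a zero-sum filter (such as a difference kernel `[[1, -1]]`, used here as an illustrative example) always yields a zero-sum second filter.

```python
import numpy as np

def convolve_kernels(k1, k2):
    # Full 2-D convolution of two small kernels; the output footprint
    # grows to (h1 + h2 - 1) x (w1 + w2 - 1).
    out = np.zeros((k1.shape[0] + k2.shape[0] - 1,
                    k1.shape[1] + k2.shape[1] - 1))
    for i in range(k1.shape[0]):
        for j in range(k1.shape[1]):
            out[i:i + k2.shape[0], j:j + k2.shape[1]] += k1[i, j] * k2
    return out
```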
Abstract:
An information processing method includes calculating a second spatial filter whose number of elements is larger than the blur size of an image, using a finite first spatial filter having anisotropy in the resolution of the image and a finite filter in which the total sum of the elements is zero and at least two elements have a non-zero value, and generating a plurality of spatial filters, each having a predetermined number of elements or fewer, from the second spatial filter.
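The last step, generating several bounded-size filters from one large filter, can be sketched via the linearity of convolution: a large kernel split into row bands, each applied at its own offset, produces responses that sum to the full-kernel response. The abstract does not specify this decomposition; row-band splitting is one assumed scheme.

```python
import numpy as np

def split_kernel(kernel, max_rows):
    # Decompose a tall kernel into (row_offset, band) sub-filters with at
    # most max_rows rows each. Because convolution is linear, summing the
    # band responses shifted by their offsets reproduces the full filter.
    return [(r0, kernel[r0:r0 + max_rows].copy())
            for r0 in range(0, kernel.shape[0], max_rows)]
```

Bounding each sub-filter's element count matters in practice when the filtering hardware or DSP pipeline only supports kernels up to a fixed size.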