Abstract:
An image processing apparatus includes at least one processor configured to execute processes including: calculating a misalignment amount of each pixel of a reference image relative to a standard image; and combining, with the standard image, the reference image converted based on the calculated amount. The calculating includes: calculating a projection conversion matrix for each of the planes with different misalignment amounts in the reference image; generating a plane map in which the plane to which each pixel of the reference image belongs and the matrix to be applied to each plane are selected based on a difference value between the standard image and each of the alignment images converted from the reference image by using each calculated matrix; suppressing a selection error of the matrix; and calculating the misalignment amount for each of the planes based on the plane map in which the selection error of the matrix is suppressed.
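As one way to picture the plane-map step, the sketch below (assuming OpenCV and NumPy, grayscale images, and candidate projection conversion matrices already estimated per plane) selects, for each pixel, the matrix that gives the smallest difference to the standard image and then median-filters the map to suppress selection errors; the helper name build_plane_map and the filter size are illustrative, not taken from the abstract.

import cv2
import numpy as np

def build_plane_map(standard, reference, homographies, smooth_ksize=5):
    """Assign each pixel of the reference image to the plane whose matrix aligns it best."""
    h, w = standard.shape[:2]
    diffs = []
    for H in homographies:
        aligned = cv2.warpPerspective(reference, H, (w, h))
        diffs.append(np.abs(standard.astype(np.float32) - aligned.astype(np.float32)))
    # Per-pixel index of the matrix giving the smallest difference value.
    plane_map = np.argmin(np.stack(diffs), axis=0).astype(np.uint8)
    # Median filtering is one simple way to suppress isolated selection errors.
    return cv2.medianBlur(plane_map, smooth_ksize)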
Abstract:
An image sensor comprising a plurality of imaging pixels and a plurality of focus detecting pixels in which opening positions of light receiving parts are shifted from those of the imaging pixels, wherein first focus detecting pixels, in which the opening positions are shifted in a first direction, are arranged at a first pixel pitch at positions corresponding to first color filters for the imaging pixels, and second focus detecting pixels, in which the opening positions are shifted in a second direction different from the first direction, are arranged at a second pixel pitch at positions corresponding to second color filters for the imaging pixels different from the first color filters.
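The arrangement can be pictured with a small layout sketch; the Bayer color-filter array, the host colors (green for the first direction, blue for the second), and the pitches used below are assumptions chosen only for illustration and are not taken from the abstract.

import numpy as np

def build_layout(height=16, width=24, pitch1=8, pitch2=12):
    """Return a map of pixel types: R/G/B imaging pixels, H/V focus detecting pixels."""
    bayer = np.empty((height, width), dtype="<U1")
    bayer[0::2, 0::2] = "R"
    bayer[0::2, 1::2] = "G"
    bayer[1::2, 0::2] = "G"
    bayer[1::2, 1::2] = "B"
    layout = bayer.copy()
    cols = np.tile(np.arange(width), (height, 1))
    # First focus detecting pixels: openings shifted in a first (horizontal)
    # direction, at a first pitch, on sites of the first color filter (G).
    layout[(bayer == "G") & (cols % pitch1 == 1)] = "H"
    # Second focus detecting pixels: openings shifted in a second (vertical)
    # direction, at a different pitch, on sites of the second color filter (B).
    layout[(bayer == "B") & (cols % pitch2 == 1)] = "V"
    return layout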
Abstract:
An image-processing device that includes at least one processor configured to: set discrete measurement regions in a standard image and a reference image selected from images acquired in a time-series manner and calculate motion vectors in the respective measurement regions; set a region of interest in the standard image; select, from the calculated motion vectors, the motion vectors in the set region of interest; estimate a homographic-transformation matrix that represents a motion in the region of interest by using the selected motion vectors; evaluate an error in the homographic-transformation matrix on the basis of the estimated homographic-transformation matrix and the calculated motion vectors; and set, on the basis of the evaluation result for the homographic-transformation matrix estimated on the basis of a first region of interest, a second region of interest whose size is increased compared with that of the first region of interest.
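A sketch of the region-of-interest growth, assuming OpenCV and NumPy, measurement-region centers and motion vectors given as N×2 float arrays, and an illustrative error threshold and growth factor (none of these values come from the abstract):

import cv2
import numpy as np

def estimate_with_roi_growth(points, vectors, roi, image_size, err_thresh=2.0, grow=1.5):
    """points: Nx2 region centers in the standard image; vectors: Nx2 motion vectors;
    roi: (x, y, w, h) first region of interest; image_size: (width, height)."""
    x, y, w, h = roi
    while True:
        inside = ((points[:, 0] >= x) & (points[:, 0] < x + w) &
                  (points[:, 1] >= y) & (points[:, 1] < y + h))
        H = None
        if np.count_nonzero(inside) >= 4:   # a homography needs at least 4 vectors
            H, _ = cv2.findHomography(points[inside], points[inside] + vectors[inside], cv2.RANSAC)
        if H is not None:
            # Evaluate the error against all calculated motion vectors.
            proj = cv2.perspectiveTransform(points.reshape(-1, 1, 2).astype(np.float32), H)
            err = np.linalg.norm(proj.reshape(-1, 2) - (points + vectors), axis=1).mean()
            if err <= err_thresh:
                return H, (x, y, w, h)
        if w >= image_size[0] and h >= image_size[1]:
            return H, (x, y, w, h)          # cannot grow the region any further
        # Set a second region of interest with an increased size and retry.
        w = min(int(w * grow), image_size[0])
        h = min(int(h * grow), image_size[1])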
Abstract:
An image pickup apparatus of the present invention includes: an image pickup optical system configured to form a subject image; an image pickup device configured to photoelectrically convert the subject image formed by the image pickup optical system, wherein part of the pixels in a region provided with image formation pixels arranged in a matrix are arranged as focus detection pixels; an optical influence estimation unit configured to use output signals from the image formation pixels, output signals from the focus detection pixels, and two-dimensional pixel position information of the focus detection pixels to estimate a level of influence of vignetting caused by the image pickup optical system according to an image height; and an optical influence correction unit configured to execute image processing whose details vary according to an estimation result of the optical influence estimation unit.
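The estimation-then-correction flow might look like the sketch below, which assumes that the vignetting influence is summarized as the ratio between a focus detection pixel's output and the mean of nearby same-color image formation pixels, fitted against image height; the polynomial fit, the switch threshold, and the neighbor pattern are illustrative choices, not the patented method itself.

import numpy as np

def fit_vignetting_profile(image, af_coords, optical_center, deg=2):
    """Fit the AF-pixel / imaging-pixel output ratio as a function of image height.
    Assumes every (row, col) in af_coords lies at least 2 pixels from the border."""
    heights, ratios = [], []
    for r, c in af_coords:
        patch = image[r - 2:r + 3:2, c - 2:c + 3:2].astype(np.float32)  # same-color neighbors
        neighbor_mean = (patch.sum() - image[r, c]) / (patch.size - 1)
        heights.append(np.hypot(r - optical_center[0], c - optical_center[1]))
        ratios.append(image[r, c] / max(neighbor_mean, 1e-6))
    return np.polyfit(heights, ratios, deg)

def correct_af_pixels(image, af_coords, optical_center, profile, switch_ratio=0.6):
    """Vary the correction details with the estimated influence level:
    mild vignetting -> gain correction, severe -> replacement by interpolation."""
    out = image.astype(np.float32).copy()
    for r, c in af_coords:
        height = np.hypot(r - optical_center[0], c - optical_center[1])
        ratio = float(np.polyval(profile, height))          # estimated influence level
        patch = image[r - 2:r + 3:2, c - 2:c + 3:2].astype(np.float32)
        neighbor_mean = (patch.sum() - image[r, c]) / (patch.size - 1)
        out[r, c] = image[r, c] / ratio if ratio > switch_ratio else neighbor_mean
    return out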
Abstract:
An image processing device includes a shooting scene determining section that determines a shooting scene, a correlation evaluation value calculating section that calculates correlation evaluation values between a target pixel and surrounding pixels, a weight setting section that sets, when the shooting scene is a low-correlation scene, heavier weights for the correlation evaluation values calculated from the surrounding pixels having high correlations, an isolated-point degree calculating section that subjects the correlation evaluation values to weighted addition to calculate an isolated-point degree, and an FPN correcting section that corrects a pixel value of the target pixel according to a magnitude of the isolated-point degree.
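A sketch of the isolated-point handling on a single target pixel, assuming absolute differences to the eight surrounding same-color pixels as the correlation evaluation values and 8-bit data; keeping only the four best-correlated neighbors in the low-correlation case and the blending strength are illustrative choices.

import numpy as np

def correct_target_pixel(window, low_correlation_scene, strength=1.0):
    """window: 3x3 same-color neighborhood with the target pixel at the center."""
    target = float(window[1, 1])
    surround = np.delete(window.astype(np.float32).ravel(), 4)
    diffs = np.abs(surround - target)            # correlation evaluation values
    if low_correlation_scene:
        # Heavier weights for the surrounding pixels with high correlation
        # (small differences), so real texture is less likely to look isolated.
        weights = np.zeros_like(diffs)
        weights[np.argsort(diffs)[:4]] = 1.0
    else:
        weights = np.ones_like(diffs)
    degree = np.sum(weights * diffs) / np.sum(weights)    # weighted addition -> isolated-point degree
    blend = np.clip(strength * degree / 255.0, 0.0, 1.0)  # correction follows the degree's magnitude
    return (1.0 - blend) * target + blend * float(np.median(surround))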
Abstract:
A corresponding region movement amount calculation unit calculates the amount of movement of each of a plurality of corresponding characteristic regions between a reference image and a base image. A clustering processing unit groups one or more characteristic regions exhibiting a substantially identical tendency in the calculated amounts of movement as belonging to a plane group located on the same plane, and classifies the plurality of characteristic regions into one or more plane groups. A projection transform matrix calculation unit calculates one or more projection transform matrices by using the amounts of movement of the characteristic regions and the result of the grouping performed by the clustering processing unit.
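A sketch of the grouping and the per-group matrix calculation, assuming k-means clustering of the movement amounts as the grouping rule and OpenCV's findHomography for the projection transform matrices; the number of plane groups is an illustrative parameter, and the actual clustering criterion in the abstract may differ.

import cv2
import numpy as np

def plane_group_homographies(base_pts, ref_pts, n_planes=2):
    """base_pts, ref_pts: Nx2 float32 arrays of corresponding characteristic-region centers."""
    movement = (ref_pts - base_pts).astype(np.float32)     # per-region amount of movement
    # Group regions whose movement shows a substantially identical tendency.
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 1e-3)
    _, labels, _ = cv2.kmeans(movement, n_planes, None, criteria, 5, cv2.KMEANS_PP_CENTERS)
    labels = labels.ravel()
    matrices = []
    for k in range(n_planes):
        sel = labels == k
        if np.count_nonzero(sel) >= 4:      # a projection transform matrix needs >= 4 regions
            H, _ = cv2.findHomography(base_pts[sel], ref_pts[sel], cv2.RANSAC)
            matrices.append(H)
    return labels, matrices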
Abstract:
An image-processing apparatus includes a computer configured to: detect a positional-displacement amount between low-resolution images, including a standard image and at least one reference image, that are acquired in a time series; generate a high-resolution combined image by positioning, based on the detected positional-displacement amount, pixel information of the reference image at a standard-image position and by performing combining thereof in a high-resolution image space; evaluate a positioning error, caused by a resolution-enhancement magnification, that occurs when the reference image is positioned in the high-resolution image space to generate the high-resolution combined image; and correct the high-resolution combined image based on the evaluation result, wherein the correcting combines a high-resolution correction image, generated by applying a filter to the high-resolution combined image based on the obtained evaluation result, with the high-resolution combined image in accordance with combining ratios based on the evaluation result.
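The correction stage might be sketched as below, assuming the positioning error has already been evaluated as a per-pixel map in [0, 1] and that a simple Gaussian filter produces the high-resolution correction image; both assumptions, and the function name correct_combined_image, are illustrative.

import cv2
import numpy as np

def correct_combined_image(hr_combined, error_map, ksize=3):
    """hr_combined: high-resolution combined image (float32),
    error_map: per-pixel positioning-error evaluation in [0, 1]."""
    # High-resolution correction image: a filtered version of the combined image.
    correction = cv2.GaussianBlur(hr_combined, (ksize, ksize), 0)
    # The per-pixel combining ratio follows the error evaluation: where the
    # positioning error is large, favor the filtered correction image.
    ratio = np.clip(error_map, 0.0, 1.0)
    if hr_combined.ndim == 3:
        ratio = ratio[..., None]
    return (1.0 - ratio) * hr_combined + ratio * correction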
Abstract:
An imaging apparatus includes a high-frequency pattern detector that calculates a difference or a ratio between a value obtained by integrating averages of pixel outputs of a first pixel group, arranged at identical positions in a vertical direction with respect to a focus detection pixel or to another focus detection pixel located around that focus detection pixel, and a value obtained by integrating averages of pixel outputs of a second pixel group arranged at positions shifted from the first pixel group in the horizontal or vertical direction, and that sets a high-frequency pattern degree in accordance with a magnitude of the difference or the ratio. An application determination unit increases a mixing rate of the focus detection pixel when the high-frequency pattern degree is high.
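One way to read the detector is sketched below, assuming the first pixel group is the column of pixels through the focus detection pixel and the second group is the same column shifted horizontally by one pixel; the normalization and the mapping from the high-frequency pattern degree to the mixing rate are illustrative.

import numpy as np

def mixing_rate_for_af_pixel(image, row, col, half_height=4, shift=1):
    """Assumes (row, col) lies far enough from the image border."""
    first = image[row - half_height:row + half_height + 1, col].astype(np.float32)
    second = image[row - half_height:row + half_height + 1, col + shift].astype(np.float32)
    # Difference between the integrated averages of the two pixel groups,
    # normalized so that it can serve as a high-frequency pattern degree.
    diff = abs(first.mean() - second.mean())
    high_freq_degree = diff / (first.mean() + 1e-6)
    # A higher degree -> a higher mixing rate of the focus detection pixel output.
    return float(np.clip(high_freq_degree * 4.0, 0.0, 1.0))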
Abstract:
An image processing apparatus includes a gain correcting unit and a gain estimating unit. The gain correcting unit corrects a gain of a pixel output of a phase-difference detecting pixel. The gain estimating unit estimates the gain with which to correct the pixel output of the phase-difference detecting pixel. The gain estimating unit includes a provisional true-value calculating unit configured to calculate a pixel output of a provisional true-value calculating pixel in accordance with a correlation of pixel outputs between a base block including the phase-difference detecting pixel and a reference block set in a search area in which a block similar to the base block is searched for.
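A sketch of the block-matching-based estimation, assuming a sum-of-absolute-differences correlation, a mask marking phase-difference detecting pixel positions, and illustrative block and search sizes; it also assumes the target pixel is far enough from the image border.

import numpy as np

def estimate_gain(image, pd_mask, r, c, block=5, search=8):
    """image: 2D raw array, pd_mask: True at phase-difference detecting pixels,
    (r, c): position of the phase-difference pixel whose gain is estimated."""
    hb = block // 2
    base = image[r - hb:r + hb + 1, c - hb:c + hb + 1].astype(np.float32)
    base_mask = pd_mask[r - hb:r + hb + 1, c - hb:c + hb + 1]
    best_cost, best_val = np.inf, None
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            ref = image[rr - hb:rr + hb + 1, cc - hb:cc + hb + 1].astype(np.float32)
            ref_mask = pd_mask[rr - hb:rr + hb + 1, cc - hb:cc + hb + 1]
            if (dr, dc) == (0, 0) or ref.shape != base.shape or ref_mask.any():
                continue
            valid = ~base_mask
            cost = np.abs(base[valid] - ref[valid]).mean()   # block correlation (SAD)
            if cost < best_cost:
                best_cost, best_val = cost, ref[hb, hb]      # provisional true value
    # Gain with which to correct the phase-difference pixel output.
    return best_val / max(float(image[r, c]), 1e-6) if best_val is not None else 1.0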
Abstract:
An image processing device includes circuitry configured to: calculate a displacement of each of a plurality of corresponding feature regions between a reference image and a base image; calculate, as an evaluation score, a difference value between the displacements of two feature regions adjacent to each other in at least one of the up/down direction, the left/right direction, and the oblique 45° direction; determine an abnormal feature region on the basis of the evaluation score; classify the other feature regions, excluding the abnormal feature region; calculate a projection conversion matrix by using the displacements of the other feature regions and the result of the classification; calculate a degree of alignment of each pixel of the reference image with respect to each pixel of the base image by using the matrix; and generate a combined image by combining the reference image converted based on the degree of alignment with the base image.
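A sketch of the adjacency-based screening, assuming the feature-region displacements are laid out on a regular grid; the threshold is an illustrative value, and np.roll wraps around at the border, which a real implementation would treat explicitly.

import numpy as np

def abnormal_region_mask(disp_grid, thresh=3.0):
    """disp_grid: HxWx2 array of per-region displacements arranged on a grid."""
    d = disp_grid.astype(np.float32)
    scores = np.zeros(d.shape[:2], dtype=np.float32)
    # Difference values to the up/down, left/right, and oblique 45-degree
    # neighbors; keep the largest one as the evaluation score.
    for dy, dx in [(0, 1), (1, 0), (1, 1), (1, -1)]:
        diff = np.linalg.norm(d - np.roll(d, (dy, dx), axis=(0, 1)), axis=2)
        scores = np.maximum(scores, diff)
    return scores > thresh     # True where a feature region is judged abnormal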