Abstract:
An image capturing apparatus: calculates, for each of a plurality of pairs of images, based on the pair of images and camera parameters, position information including information specifying a three-dimensional position of a measurement point with respect to a pair of pixels respectively included in the pair of images, and information specifying a position of each of the pair of pixels; performs, based on the position information and the camera parameters, weighted addition on the information specifying the three-dimensional position of the measurement point in each pair of images, using a weight based on an effective baseline length or a viewing angle, with respect to the three-dimensional position, of the pair of cameras corresponding to the pair of images; and outputs position information of the measurement point to which the weighted addition value is applied.
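As an illustration of the weighted-addition step, here is a minimal numpy sketch that fuses per-pair 3D estimates of one measurement point, assuming the effective baseline length of each camera pair is used directly as the weight (the abstract allows a viewing-angle-based weight instead); all names and values are hypothetical.

```python
import numpy as np

def fuse_measurements(points_3d, baselines):
    """Fuse per-pair 3D estimates of one measurement point by weighted
    addition, weighting each camera pair by its effective baseline length
    (longer baselines generally give lower triangulation error)."""
    points_3d = np.asarray(points_3d, dtype=float)   # shape (n_pairs, 3)
    weights = np.asarray(baselines, dtype=float)     # one weight per pair
    weights /= weights.sum()                         # normalize weights
    return weights @ points_3d                       # weighted addition value

# Example: three camera pairs observed the same measurement point.
estimates = [[1.02, 0.48, 3.9], [0.98, 0.52, 4.1], [1.00, 0.50, 4.0]]
baselines = [0.10, 0.25, 0.40]                       # metres, hypothetical
print(fuse_measurements(estimates, baselines))
```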
Abstract:
A camera calibration method, which calculates camera parameters of two cameras using calibration points, includes: (a1) acquiring three-dimensional coordinate sets of the calibration points and image coordinate pairs of the calibration points in a camera image of each camera; (a2) acquiring multiple camera parameters of each camera; (a3) for each calibration point, calculating a view angle-corresponding length corresponding to a view angle of the two cameras viewing the calibration point; (a4) for each calibration point, calculating a three-dimensional position of a measurement point corresponding to a three-dimensional position of the calibration point using parallax of the calibration point between the two cameras; (a5) for each calibration point, weighting a difference between the three-dimensional coordinate set of the calibration point and the three-dimensional position of the measurement point corresponding to the calibration point using the view angle-corresponding length corresponding to the calibration point; and (a6) updating the camera parameters based on the weighted difference.
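A minimal numpy sketch of steps (a3) through (a6), assuming a hypothetical triangulation routine `measure(params, i)` that returns the measurement point for calibration point i from stereo parallax, and a numeric gradient for the parameter update; the weighting follows step (a5) directly.

```python
import numpy as np

def weighted_error(params, calib_points, view_lengths, measure):
    """Steps (a4)-(a5): weight the squared difference between each known
    calibration-point coordinate set and the measurement point triangulated
    from stereo parallax by its view angle-corresponding length."""
    err = 0.0
    for i, (p_true, w) in enumerate(zip(calib_points, view_lengths)):
        p_meas = measure(params, i)               # step (a4), hypothetical
        err += w * np.sum((p_true - p_meas) ** 2) # step (a5)
    return err

def update(params, f, lr=1e-3, eps=1e-6):
    """Step (a6): one gradient-descent update of the camera parameters,
    using a central-difference numeric gradient of the weighted error f."""
    g = np.zeros_like(params)
    for k in range(params.size):
        d = np.zeros_like(params)
        d[k] = eps
        g[k] = (f(params + d) - f(params - d)) / (2 * eps)
    return params - lr * g
```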
Abstract:
A camera-parameter-set calculation apparatus includes a three-dimensional point group calculator that calculates a plurality of three-dimensional coordinates, based on first and second images respectively captured by first and second cameras and first and second camera parameter sets of the first and second cameras; an evaluation value calculator that determines a plurality of second pixel coordinates in the second image, based on the plurality of three-dimensional coordinates and the second camera parameter set, determines a plurality of third pixel coordinates in a third image captured by a third camera, based on the plurality of three-dimensional coordinates and a third camera parameter set of the third camera, and calculates an evaluation value, based on pixel values at the plurality of second and third pixel coordinates in the second and third images; and a camera-parameter-set determiner that determines a fourth camera parameter set for the third camera, based on the evaluation value.
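The evaluation value might be computed along these lines; a minimal numpy sketch assuming pinhole cameras given as (K, R, t) tuples, grayscale images, nearest-neighbor sampling, and 3D points that project inside both images (no bounds checking).

```python
import numpy as np

def project(points, K, R, t):
    """Pinhole projection of Nx3 world points with intrinsics K, pose R, t."""
    cam = points @ R.T + t
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]

def evaluation_value(points, img2, img3, cam2, cam3):
    """Sum of absolute pixel-value differences between the second and third
    images at the coordinates where the 3D points project; a consistent
    third-camera parameter set drives this value down."""
    uv2 = np.rint(project(points, *cam2)).astype(int)
    uv3 = np.rint(project(points, *cam3)).astype(int)
    v2 = img2[uv2[:, 1], uv2[:, 0]].astype(float)
    v3 = img3[uv3[:, 1], uv3[:, 0]].astype(float)
    return np.abs(v2 - v3).sum()
```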
Abstract:
An imaging apparatus includes an imaging optical system that forms an optical signal, an imaging device that includes a plurality of pixels and that converts the optical signal formed on the plurality of pixels into an electrical signal, a color filter that is arranged between the imaging optical system and the imaging device and that has a different optical transmittance for each of the plurality of pixels and each of a plurality of wavelength ranges, and a transmission data compression circuit that compresses the electrical signal obtained by the imaging device. The sum of products of an optical transmittance group relating to a plurality of optical transmittances of the color filter for each of the plurality of pixels in the plurality of wavelength ranges and coefficients common to the plurality of pixels is the same among the plurality of pixels.
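The constant-sum condition can be stated concretely. In this hypothetical numpy sketch, T[p, w] is the transmittance of pixel p in wavelength range w, c holds the coefficients common to the pixels, and the filter is rescaled so that the sum of products T @ c comes out identical for every pixel.

```python
import numpy as np

# T[p, w]: optical transmittance of the color filter at pixel p in
# wavelength range w; c[w]: coefficients common to all pixels.
rng = np.random.default_rng(0)
T = rng.uniform(0.2, 0.9, size=(4, 3))      # hypothetical 4 pixels, 3 ranges
c = np.array([0.3, 0.5, 0.2])

# Rescale each pixel's transmittances so the sum of products T @ c
# is the same for every pixel, as the abstract requires.
s = T @ c
T_adjusted = T * (s.mean() / s)[:, None]
print(T_adjusted @ c)                        # identical value for each pixel
```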
Abstract:
An imaging apparatus includes an optical imaging system that converges light from an object; an imaging device that includes a plurality of pixels, receives the converged light, and converts the received light to an electric signal; a filter unit that is disposed between the optical imaging system and the imaging device and includes a plurality of color filters having different light transmission rate characteristics; and a transmission data compressing circuit that codes the electric signal. An overall light transmission rate characteristic of the filter unit differs randomly among the pixels of the imaging device, and the transmission data compressing circuit weights and codes the electric signal of each pixel by using the reciprocal of the proportion of the overall light transmission rate characteristic of the filter unit corresponding to that pixel relative to a wavelength characteristic common among the pixels.
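A minimal sketch of the weighting rule, assuming one scalar transmittance per pixel, a scalar common wavelength characteristic, and simple rounding as a stand-in for the actual coding; all names and values are hypothetical.

```python
import numpy as np

def weight_and_code(signal, transmittance, common_ref):
    """Weight each pixel's electric signal by the reciprocal of the
    proportion of its filter transmittance to a wavelength characteristic
    common among the pixels, then quantize (a stand-in for coding)."""
    proportion = transmittance / common_ref
    weighted = signal / proportion           # reciprocal weighting
    return np.round(weighted).astype(np.int16)

rng = np.random.default_rng(1)
sig = rng.uniform(0, 255, size=8)            # hypothetical pixel signals
trans = rng.uniform(0.3, 0.9, size=8)        # random per-pixel transmittance
print(weight_and_code(sig, trans, common_ref=0.6))
```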
Abstract:
In an imaging device, a difference calculation unit calculates a differential signal between charge signals that have been accumulated and are held by first and second charge holding units with different timings. A multiple sampling unit performs multiple sampling processing on the differential signal, and an analog digital conversion unit converts a signal that has undergone multiple sampling processing to a digital signal. That is, multiple sampling processing is performed on a differential signal with a higher sparsity than that of an image signal.
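Conceptually, the multiple sampling step can be sketched as a random linear measurement of the sparse differential signal, in the style of compressed sensing; the matrix Phi, the sizes, and the values below are hypothetical, and rounding stands in for the analog-digital conversion.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 64, 16                                 # n pixels, m << n measurements

# Charge signals held with different timings; their difference is sparse
# because only a few pixels changed between the two exposures.
q1 = rng.uniform(0, 100, size=n)
q2 = q1.copy()
q2[[5, 40]] += [30.0, -20.0]                  # small number of changes
diff = q2 - q1                                # differential signal (sparse)

Phi = rng.choice([0.0, 1.0], size=(m, n))     # multiple-sampling matrix
y = Phi @ diff                                # m analog sums
digital = np.round(y).astype(int)             # A/D conversion stand-in
print(digital)
```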
Abstract:
In an imaging device, a multiple sampling unit performs multiple sampling processing on a charge signal of a captured image, and an analog digital conversion unit converts a signal which has undergone multiple sampling processing to a digital signal. In a reconstruction device, an image reconstruction unit performs reconstruction processing on the digital signal transmitted from the imaging device using information regarding multiple sampling processing transmitted from the imaging device, and obtains an image signal.
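On the reconstruction side, one standard way to recover a sparse signal from such measurements is iterative soft thresholding; the abstract does not fix a particular reconstruction algorithm, so this numpy sketch simply assumes the "information regarding multiple sampling processing" transmitted by the imaging device is the sampling matrix Phi.

```python
import numpy as np

def ista(y, Phi, lam=0.1, lr=None, iters=500):
    """Reconstruct a sparse signal x from measurements y = Phi @ x by
    iterative soft thresholding, given the multiple-sampling matrix Phi
    transmitted from the imaging device."""
    if lr is None:
        lr = 1.0 / np.linalg.norm(Phi, 2) ** 2         # safe step size
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x + lr * Phi.T @ (y - Phi @ x)             # gradient step
        x = np.sign(x) * np.maximum(np.abs(x) - lam * lr, 0.0)  # shrink
    return x
```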
Abstract:
A third model training part trains a second neural network model by backpropagation using an error between: an identification result that a third neural network model, formed by connecting a trained first neural network model and the second neural network model to each other, outputs upon receiving second sensing data and a first operation parameter; and correct identification information corresponding to the second sensing data. A second operation parameter acquisition part acquires a second operation parameter by updating the first operation parameter through the first neural network model by the backpropagation.
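A minimal PyTorch sketch of this training loop, assuming toy linear models and an MSE loss: the first model's weights are frozen, yet gradients still flow through it, so a single optimizer step trains the second model and simultaneously updates the operation parameter; the model shapes and data are hypothetical.

```python
import torch
import torch.nn as nn

# Trained first model (frozen) maps the operation parameter to a feature.
first = nn.Linear(4, 8)
for p in first.parameters():
    p.requires_grad_(False)

second = nn.Linear(8 + 16, 3)                   # second model, being trained
opt_param = torch.randn(4, requires_grad=True)  # first operation parameter

sensing = torch.randn(16)                       # second sensing data
target = torch.tensor([1.0, 0.0, 0.0])          # correct identification info

optimizer = torch.optim.SGD([opt_param, *second.parameters()], lr=0.1)
for _ in range(100):
    optimizer.zero_grad()
    feat = first(opt_param)                     # gradients flow through first
    out = second(torch.cat([sensing, feat]))    # connected (third) model output
    loss = nn.functional.mse_loss(out, target)
    loss.backward()                             # backpropagation
    optimizer.step()                            # updates the second model AND
                                                # the parameter via the first

second_operation_param = opt_param.detach()     # acquired second parameter
```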
Abstract:
An image generating apparatus generates an image to be displayed on a display and includes at least one memory and a control circuit. The control circuit acquires a plurality of camera images captured by a plurality of cameras installed in a vehicle, calculates a distance between one of the cameras and a target to be projected in the camera images, detects a position of a light-transmissive object or a reflective object in the camera images, and generates an image from a point of view that is different from points of view of the plurality of camera images by using the plurality of camera images and the distance, the generated image including a predetermined image that is displayed at the position of the light-transmissive object or the reflective object.
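The viewpoint change could be realized by forward-warping each camera image with its per-pixel distance; this numpy sketch assumes pinhole intrinsics, positive depths in front of both cameras, a mask already expressed in the virtual view marking the detected light-transmissive or reflective object, and nearest-neighbor splatting with no occlusion handling.

```python
import numpy as np

def synthesize_view(img, depth, K_src, K_virt, R, t, overlay, mask):
    """Warp one camera image into a virtual viewpoint using per-pixel
    distance, then stamp a predetermined image wherever a transmissive
    or reflective object was detected, since depth there is unreliable."""
    h, w = depth.shape
    out = np.zeros_like(img)
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs, ys, np.ones_like(xs)], -1).reshape(-1, 3).astype(float)
    rays = pix @ np.linalg.inv(K_src).T              # back-project pixels
    pts = rays * depth.reshape(-1, 1)                # 3D points, source frame
    cam = pts @ R.T + t                              # to virtual camera frame
    uvw = cam @ K_virt.T
    uv = np.rint(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    ok = (uvw[:, 2] > 0) & (uv[:, 0] >= 0) & (uv[:, 0] < w) \
         & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    out[uv[ok, 1], uv[ok, 0]] = img.reshape(-1, img.shape[-1])[ok]
    out[mask] = overlay[mask]                        # predetermined image
    return out
```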
Abstract:
A depth acquisition device includes a memory and a processor. The processor performs: acquiring timing information indicating a timing at which a light source irradiates a subject with infrared light; acquiring, from the memory, an infrared light image generated by imaging a scene including the subject with the infrared light according to the timing indicated by the timing information; acquiring, from the memory, a visible light image generated by imaging a substantially same scene as the scene of the infrared light image, with visible light from a substantially same viewpoint as a viewpoint of imaging the infrared light image at a substantially same time as a time of imaging the infrared light image; detecting a flare region from the infrared light image; and estimating a depth of the flare region based on the infrared light image, the visible light image, and the flare region.
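A minimal numpy sketch of the last two steps, assuming flare shows up as near-saturated infrared pixels and using a crude visible-light-guided fill-in as the depth estimator; the device's actual detection and estimation methods are not specified beyond the three inputs.

```python
import numpy as np

def detect_flare(ir_img, sat=0.95):
    """Flag pixels where the infrared image is near saturation; flare
    regions corrupt infrared-based depth. A simple threshold stands in
    for the device's flare detector."""
    return ir_img >= sat * ir_img.max()

def estimate_depth(depth_ir, vis_img, flare):
    """Replace depth inside the flare region with the depth of the
    non-flare pixel whose visible-light intensity is most similar
    (a crude guided fill-in over the three inputs)."""
    out = depth_ir.copy()
    good = ~flare
    for y, x in zip(*np.nonzero(flare)):
        d = np.abs(vis_img[good] - vis_img[y, x])
        out[y, x] = depth_ir[good][np.argmin(d)]
    return out
```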