Abstract:
In an imaging device, a multiple sampling unit performs multiple sampling processing on a charge signal of a captured image, and an analog-digital conversion unit converts the signal that has undergone the multiple sampling processing into a digital signal. In a reconstruction device, an image reconstruction unit performs reconstruction processing on the digital signal transmitted from the imaging device, using information regarding the multiple sampling processing transmitted from the imaging device, and obtains an image signal.
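As a rough illustration of the split between the imaging device and the reconstruction device, the sketch below models multiple sampling as a known measurement matrix applied to the flattened image, with quantization standing in for analog-to-digital conversion. The matrix, the sizes, and the regularized least-squares recovery are illustrative assumptions, not details taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64          # number of pixels in the (flattened) captured image
m = 32          # number of multiplexed samples sent by the imaging device

x_true = rng.random(n)             # charge signal of the captured image
A = rng.standard_normal((m, n))    # multiple-sampling (multiplexing) matrix

# Imaging device: multiple sampling followed by A/D conversion (quantization).
y = np.round(A @ x_true * 255) / 255

# Reconstruction device: recovers an image signal from the digital samples y
# and the transmitted sampling information A (Tikhonov-regularized solve).
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

print("reconstruction error:", np.linalg.norm(x_hat - x_true))
```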
Abstract:
A depth acquisition device includes a memory and a processor. The processor performs: acquiring timing information indicating a timing at which a light source irradiates a subject with infrared light; acquiring, from the memory, an infrared light image generated by imaging a scene including the subject with the infrared light according to the timing indicated by the timing information; acquiring, from the memory, a visible light image generated by imaging substantially the same scene as the scene of the infrared light image, with visible light, from substantially the same viewpoint as the viewpoint of imaging the infrared light image, and at substantially the same time as the time of imaging the infrared light image; detecting a flare region from the infrared light image; and estimating a depth of the flare region based on the infrared light image, the visible light image, and the flare region.
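The following sketch illustrates one way the described flow could look in code: flare is detected as saturated infrared pixels, and depth inside the flare region is re-estimated by borrowing depth from nearby non-flare pixels weighted by their similarity in the aligned visible-light image. The threshold, window size, and weighting are assumptions for illustration, not the device's actual method.

```python
import numpy as np

def estimate_flare_depth(ir, depth, visible, flare_thresh=0.95, win=5):
    """ir, depth, visible: HxW float arrays in [0, 1], aligned to each other."""
    flare = ir >= flare_thresh          # flare region: saturated infrared pixels
    out = depth.copy()
    h, w = ir.shape
    r = win // 2
    for y, x in zip(*np.nonzero(flare)):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        valid = ~flare[y0:y1, x0:x1]
        if not valid.any():
            continue
        # Borrow depth from non-flare neighbours that look similar in the
        # visible-light image (a joint-bilateral-style weighting).
        diff = visible[y0:y1, x0:x1] - visible[y, x]
        wgt = np.exp(-(diff ** 2) / 0.01) * valid
        out[y, x] = (wgt * depth[y0:y1, x0:x1]).sum() / wgt.sum()
    return out, flare
```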
Abstract:
An image generating apparatus generates an image to be displayed on a display and includes at least one memory and a control circuit. The control circuit acquires a plurality of camera images captured by a plurality of cameras installed in a vehicle, calculates a distance between one of the cameras and a target to be projected in the camera images, detects a position of a light-transmissive object or a reflective object in the camera images, and generates an image from a point of view that is different from the points of view of the plurality of camera images by using the plurality of camera images and the distance, the generated image including a predetermined image that is displayed at the position of the light-transmissive object or the reflective object.
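A rough sketch of the viewpoint-synthesis step follows: each camera pixel is back-projected with its calculated distance, re-projected into a virtual camera, and the predetermined image value is painted over pixels flagged as belonging to a light-transmissive or reflective object. The pinhole model, the intrinsics K, the virtual pose (R, t), and the mask input are illustrative assumptions.

```python
import numpy as np

def synthesize_view(image, distance, K, R, t, mask, placeholder=255):
    """image: HxWx3 source camera image; distance: HxW per-pixel distance;
    K: 3x3 intrinsics; (R, t): pose of the virtual camera relative to the
    source camera; mask: HxW bool map of transparent/reflective pixels."""
    h, w = distance.shape
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)], axis=0)
    # Back-project every pixel to a 3-D point using its measured distance.
    pts = np.linalg.inv(K) @ pix * distance.ravel()
    # Transform into the virtual camera and project onto its image plane.
    proj = K @ (R @ pts + t.reshape(3, 1))
    z = proj[2]
    ok = z > 1e-6
    u = np.round(proj[0, ok] / z[ok]).astype(int)
    v = np.round(proj[1, ok] / z[ok]).astype(int)
    inb = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    src = np.flatnonzero(ok)[inb]
    out[v[inb], u[inb]] = image.reshape(h * w, 3)[src]
    # Overlay the predetermined image value at transparent/reflective pixels.
    m = mask.ravel()[src]
    out[v[inb][m], u[inb][m]] = placeholder
    return out
```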
Abstract:
An imaging apparatus in an embodiment includes lens optical systems each including a lens whose surface closest to the target object is shaped to be convex toward the target object, imaging regions which respectively face the lens optical systems and output a photoelectrically converted signal corresponding to an amount of light transmitted through the lens optical systems and received by the imaging regions, and a light-transmissive cover which covers an exposed portion of the lens of each of the lens optical systems and a portion between the lens of one of the lens optical systems and the lens of another one of the lens optical systems adjacent to the one of the lens optical systems, the cover having a curved portion which is convex toward the target object. The optical axes of the lens optical systems are parallel to each other.
Abstract:
A crossing point detector includes memory and a crossing point detection unit that reads out a square image from a captured image in the memory, and detects a crossing point of two boundary lines in a checker pattern depicted in the square image. The crossing point detection unit decides multiple parameters of a function model treating two-dimensional image coordinates as variables, the parameters optimizing an evaluation value based on a difference between corresponding pixel values represented by the function model and the square image, respectively, and computes the position of a crossing point of two straight lines expressed by the decided multiple parameters to thereby detect the crossing point with subpixel precision. The function model uses a curved surface that is at least first-order differentiable to express pixel values at respective positions in a two-dimensional coordinate system at the boundary between black and white regions.
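As a compact illustration of the approach, the sketch below fits a first-order differentiable checker model, built from two straight lines with smooth (logistic) transitions between black and white regions, to the square image by minimizing a sum-of-squared-differences evaluation value, and then reads the subpixel crossing point off the fitted line parameters. The logistic edge model, the initial guess, and the Nelder-Mead optimizer are assumptions, not the patent's exact function model.

```python
import numpy as np
from scipy.optimize import minimize

def checker_model(params, xs, ys):
    # Two straight lines a1*x + b1*y + c1 = 0 and a2*x + b2*y + c2 = 0;
    # the pixel value is a smooth (first-order differentiable) checker pattern.
    a1, b1, c1, a2, b2, c2, lo, hi, s = params
    d1 = a1 * xs + b1 * ys + c1
    d2 = a2 * xs + b2 * ys + c2
    sig = lambda d: 1.0 / (1.0 + np.exp(-d / s))
    # Bright where the two half-plane signs agree, dark otherwise.
    m = sig(d1) * sig(d2) + (1 - sig(d1)) * (1 - sig(d2))
    return lo + (hi - lo) * m

def detect_crossing(square):
    h, w = square.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    def cost(p):
        return np.sum((checker_model(p, xs, ys) - square) ** 2)
    p0 = [1, 0, -w / 2, 0, 1, -h / 2, square.min(), square.max(), 1.0]
    p = minimize(cost, p0, method="Nelder-Mead").x
    a1, b1, c1, a2, b2, c2 = p[:6]
    # Crossing point of the two fitted lines, in subpixel coordinates.
    A = np.array([[a1, b1], [a2, b2]])
    return np.linalg.solve(A, [-c1, -c2])
```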
Abstract:
An image capturing apparatus includes a first camera that captures a first image, a second camera that captures a second image, a lens cover that includes transparent parts and ridgelines and that covers the first camera and the second camera, and a processing circuit that identifies a pixel located in an area, in which it is necessary to interpolate a pixel value, in the first image, and generates an output image using the first image and interpolation pixel information for interpolating a pixel value of the identified pixel. Each ridgeline between adjacent parts of the lens cover is twisted with respect to a base line extending between a center of a first lens of the first camera and a center of a second lens of the second camera. An upper part of the lens cover opposes a base on which the first camera and the second camera are disposed.
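A minimal sketch of the interpolation step described above: pixels identified as needing interpolation (for example, those imaged through a ridgeline of the lens cover) are replaced by the corresponding interpolation pixel information, here assumed to be an image already registered to the first image.

```python
def generate_output(first_image, needs_interp, interp_info):
    """first_image: HxW(x3) array; needs_interp: HxW bool mask of pixels whose
    value must be interpolated; interp_info: array of the same shape holding
    the interpolation pixel information."""
    out = first_image.copy()
    out[needs_interp] = interp_info[needs_interp]
    return out
```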
Abstract:
A camera parameter set calculation method includes acquiring a first image from a first camera, a second image from a second camera, and first and second camera parameter sets of the first camera and the second camera, calculating three-dimensional coordinate sets on the basis of the first image, the second image, the first camera parameter set, and the second camera parameter set, determining first pixel coordinate pairs obtained by projecting the three-dimensional coordinate sets onto the first image and second pixel coordinate pairs obtained by projecting the three-dimensional coordinate sets onto the second image, calculating an evaluation value on the basis of the pixel values at the first pixel coordinate pairs and the pixel values at the second pixel coordinate pairs, and updating the first camera parameter set and the second camera parameter set on the basis of the evaluation value.
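The sketch below illustrates the evaluation value described above: each three-dimensional point is projected into both images and the pixel values sampled at the resulting coordinate pairs are compared; the camera parameter sets would then be updated so that this value decreases. The project() callable and the absolute-difference measure are assumptions standing in for the method's projection model and evaluation function.

```python
def evaluation_value(img1, img2, points3d, params1, params2, project):
    """Sum of absolute pixel-value differences sampled at the first and
    second pixel coordinate pairs; smaller suggests better calibration."""
    total = 0.0
    for p in points3d:
        u1, v1 = project(p, params1)   # first pixel coordinate pair
        u2, v2 = project(p, params2)   # second pixel coordinate pair
        total += abs(float(img1[int(round(v1)), int(round(u1))])
                     - float(img2[int(round(v2)), int(round(u2))]))
    return total
```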
Abstract:
An imaging apparatus includes an image-forming optical system that forms an image by using optical signals; an imaging device that includes a plurality of pixels, receives, with the plurality of pixels, the optical signals used to form the image, and converts the optical signals into electric signals; and a color filter that is located between the image-forming optical system and the imaging device and has a light transmittance which differs according to positions on the color filter corresponding to the plurality of pixels and according to a plurality of wavelength bands.
Abstract:
An imaging system serving as an image generation device is provided with: a random optical filter array that has a plurality of types of optical filters and a scattering unit; photodiodes that receive light transmitted through the random optical filter array; an AD conversion unit that converts the light received by the photodiodes into digital data; and a color image generation circuit that generates an image, using the digital data and modulation information of the random optical filter array, in which the scattering unit is located between the plurality of types of optical filters and the photodiodes, and in which the scattering unit includes a material having a first refractive index, and a material having a second refractive index that is different from the first refractive index.
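As a rough illustration of the image generation step, the sketch below models each photodiode reading as a known modulation of the scene colors and recovers the color image by a least-squares inversion; the three-band model and the block size are illustrative assumptions, not details from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

bands = 3        # e.g. R, G, B components to recover
pixels = 16      # photodiodes contributing to one reconstruction block

M = rng.random((pixels, bands))   # modulation information of the filter array
scene = rng.random(bands)         # true color of the block

data = M @ scene                  # digitized photodiode readings (AD output)
color, *_ = np.linalg.lstsq(M, data, rcond=None)   # color image generation

print("recovered:", color, "true:", scene)
```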