Abstract:
An integrated circuit for an imaging system is disclosed. In one aspect, an integrated circuit has an array of optical sensors, an array of optical filters integrated with the sensors and configured to pass a band of wavelengths onto one or more of the sensors, and read-out circuitry to read out pixel values from the sensors to represent an image. Different ones of the optical filters are configured to have different thicknesses, so as to pass different bands of wavelengths by means of interference and to allow detection of a spectrum of wavelengths. The read-out circuitry can enable multiple pixels under one optical filter to be read out in parallel. The thicknesses may vary non-monotonically across the array. The read-out, or later image processing, may involve selection of, or interpolation between, wavelengths to carry out spectral sampling or shifting and so compensate for thickness errors.
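The interference condition that links a filter's thickness to its passband can be sketched as below. The first-order peak relation m·λ = 2·n·d is standard thin-film interference physics; the specific thickness values and refractive index are illustrative assumptions, not figures from the abstract.

```python
# Sketch: relation between interference-filter cavity thickness and the
# wavelength it passes. Peak condition: m * lambda = 2 * n * d.
# Thickness values below are illustrative, deliberately non-monotonic.

def peak_wavelength_nm(thickness_nm: float, refractive_index: float = 1.0,
                       order: int = 1) -> float:
    """Wavelength (nm) of the order-m transmission peak of a cavity of
    optical thickness n * d."""
    return 2.0 * refractive_index * thickness_nm / order

# A non-monotonic thickness profile still yields a usable spectrum, since
# each filter is read out and mapped to its own passband afterwards.
thicknesses = [300.0, 250.0, 350.0, 275.0]  # nm
passbands = [peak_wavelength_nm(d) for d in thicknesses]
```

Interpolating between the resulting passbands is then a purely numerical step, which is what allows thickness errors to be compensated in software.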
Abstract:
A spectral camera is disclosed, having an objective lens, an array of lenses for producing optical copies of segments of the image, an array of filters for the different optical channels arranged in an interleaved spatial pattern, and a sensor array to detect the copies of the image segments. Detected segment copies of spatially adjacent optical channels have different passbands and represent overlapping segments of the image, while detected segment copies of the same passband on spatially non-adjacent optical channels represent adjacent segments of the image which fit together. Copying segments of the image can enable better optical quality for a given cost. The interleaved pattern of filter bands with overlapping segments enables each point of the image to be sensed at different bands, so that the spectral output for many bands is obtained simultaneously, providing better temporal resolution.
Abstract:
A spectral camera for producing a spectral output is disclosed. The spectral camera has an objective lens for producing an image, an array of mirrors, an array of filters for passing a different passband of the optical spectrum for different ones of the optical channels arranged so as to project multiple of the optical channels onto different parts of the same focal plane, and a sensor array at the focal plane to detect the filtered image copies simultaneously. By using mirrors, there may be less optical degradation, and the trade-off of cost against optical quality can be better. By projecting the optical channels onto different parts of the same focal plane, a single sensor or coplanar multiple sensors can be used to detect the different optical channels simultaneously, which promotes simpler alignment and manufacturing.
Abstract:
A solid-state spectral imaging device is described. The device includes an image sensor and a plurality of optical filters directly processed on top of the image sensor. Each optical filter includes a first mirror and a second mirror defining an optical filter cavity having a fixed height. Each optical filter also includes a first electrode and a second electrode having a fixed position located opposite to each other and positioned to measure the height of the optical filter cavity. Further, a method to calibrate spectral data of light and a computer program for calibrating light is described.
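Electrodes positioned opposite each other across a cavity can measure its height capacitively, via the parallel-plate relation C = ε₀·A/d. The sketch below inverts that relation to recover the cavity height for calibration; the plate area and measured value are illustrative assumptions, not values from the abstract.

```python
# Sketch: inferring the optical filter cavity height from a capacitance
# measured between the two fixed electrodes, assuming an ideal
# parallel-plate model C = eps0 * A / d (an illustrative assumption).

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cavity_height_m(capacitance_f: float, plate_area_m2: float) -> float:
    """Invert C = eps0 * A / d for the gap d (metres)."""
    return EPS0 * plate_area_m2 / capacitance_f
```

Once the actual height is known, the spectral data can be corrected for any deviation of the cavity from its nominal, fixed design height.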
Abstract:
A device for detecting particles in air is disclosed. The device comprises: a receiver for receiving a flow of air comprising particles; a particle capturing arrangement configured to transfer the particles from the flow of air to a liquid for collection of a set of particles in the liquid; a flow channel configured to pass a flow of the liquid comprising the set of particles through the flow channel; a light source configured to illuminate the set of particles in the flow channel, such that an interference pattern is formed by interference between light scattered by the set of particles and non-scattered light from the light source; and an image sensor comprising a plurality of photo-sensitive elements configured to detect incident light, the image sensor being configured to detect the interference pattern.
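The interference pattern the image sensor records is the squared magnitude of the sum of the unscattered reference wave and the wave scattered by the particles. A minimal numerical sketch of that detection model, assuming complex scalar fields (a toy model for illustration, not the device's processing chain):

```python
import numpy as np

# Sketch: in-line holographic detection. The sensor records
# I = |r + s|^2 = |r|^2 + |s|^2 + 2*Re(r * conj(s));
# the cross term carries the fringe (interference) pattern.

def interference_intensity(reference: np.ndarray,
                           scattered: np.ndarray) -> np.ndarray:
    """Intensity seen by the photo-sensitive elements for a reference
    (non-scattered) field and a scattered field."""
    total = reference + scattered
    return np.abs(total) ** 2
```

With no scattered light the intensity is just that of the reference wave; where scattered and reference light are in phase, the intensity quadruples rather than doubles, which is the fringe contrast the sensor exploits.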
Abstract:
An example method and hyperspectral imaging (HSI) system for imaging a scene are provided. The method images the scene with an HSI system that includes a sensor with a plurality of sensor pixels and a plurality of spectral filters, each spectral filter being associated with one of the sensor pixels. The method comprises obtaining a higher-resolution spatial image by illuminating the scene with a first set of wavelengths, wherein each spectral filter passes the first set of wavelengths to the sensor pixel it is associated with. The method further comprises obtaining a lower-resolution hyperspectral image by illuminating the scene with a second set of wavelengths, wherein each spectral filter passes only a subset of the second set of wavelengths to the sensor pixel it is associated with.
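The two illumination modes can be sketched with simple set arithmetic: under the first wavelength set every filter transmits, so every pixel contributes spatial information; under the second set each filter transmits only its own subset, yielding spectral information at mosaic resolution. The wavelength values and per-filter passbands below are illustrative assumptions.

```python
# Sketch of the two acquisition modes. FIRST_SET is passed by every filter;
# each filter additionally passes only its own band from the second set.
# All values are illustrative assumptions.

FIRST_SET = {450, 550, 650}                      # nm, passed by all filters
FILTER_PASSBANDS = [{700}, {720}, {740}, {760}]  # nm, one per filter

def transmitted(filter_idx: int, illumination: set) -> set:
    """Wavelengths reaching the sensor pixel under the given filter."""
    passband = FILTER_PASSBANDS[filter_idx] | FIRST_SET
    return passband & illumination
```

Illuminating with FIRST_SET makes every pixel respond identically (full spatial resolution); illuminating with the second set makes each pixel respond only in its own band (lower-resolution hyperspectral image).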
Abstract:
Embodiments described herein relate to lens-free imaging. One example embodiment may include a lens-free imaging device for imaging a moving sample. The lens-free imaging device may include a radiation source configured to emit a set of at least two different wavelengths towards the moving sample. The lens-free imaging device is configured to image samples whose spectral response does not substantially vary across the set of at least two different wavelengths. The lens-free imaging device may also include a line scanner configured to obtain a line scan per wavelength emitted by the radiation source and reflected by, scattered by, or transmitted through the moving sample. The line scanner is configured to obtain a line scan per wavelength at regular intervals. Either the radiation source or the line scanner is configured to isolate data of the at least two different wavelengths.
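Isolating the data of each wavelength from a regularly interleaved line-scan stream can be sketched as a simple de-interleaving step. The round-robin wavelength ordering assumed below is an illustration, not a constraint stated in the abstract.

```python
# Sketch: de-interleaving regularly acquired line scans into one image
# (list of lines) per wavelength, assuming a round-robin ordering in which
# lines[k] was acquired at wavelength k % n_wavelengths.

def assemble_per_wavelength(lines: list, n_wavelengths: int) -> list:
    """Return one list of line scans per wavelength."""
    return [lines[k::n_wavelengths] for k in range(n_wavelengths)]
```

As the sample moves past the line scanner, each per-wavelength list accumulates into a full two-dimensional image of the sample at that wavelength.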
Abstract:
A user device including a camera, a spectrometer module, and a processing unit is disclosed. In one aspect, the camera is adapted to acquire at least one image of a scene that falls within a field of view of the camera. The spectrometer module is adapted to acquire spectral information from a region within the scene, which region falls within a field of view of the spectrometer module. The processing unit is adapted to determine, based on information relating the field of view of the spectrometer module to the field of view of the camera, a spectrometer module target area, within the at least one image, corresponding to the region. The processing unit is further adapted to output display data to a screen of the user device for providing an indication of the target area on the screen.
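Relating the two fields of view can be sketched with a simple geometric model: if the spectrometer shares the camera's optical axis and projection is proportional to angle, the target area is a centered rectangle scaled by the ratio of the fields of view. Both assumptions are illustrative; a real device would use calibrated mapping data.

```python
# Sketch: mapping the spectrometer module's field of view into pixel
# coordinates of the camera image, assuming a shared optical axis and a
# linear angle-to-pixel projection (illustrative assumptions).

def target_area_px(cam_fov_deg: float, spec_fov_deg: float,
                   image_w: int, image_h: int) -> tuple:
    """Return (x0, y0, x1, y1) of a centered rectangle whose size is the
    ratio of the spectrometer FOV to the camera FOV."""
    ratio = spec_fov_deg / cam_fov_deg
    w, h = image_w * ratio, image_h * ratio
    x0 = (image_w - w) / 2.0
    y0 = (image_h - h) / 2.0
    return (x0, y0, x0 + w, y0 + h)
```

The resulting rectangle is what the processing unit would outline on the screen so the user can see which part of the scene the spectral reading corresponds to.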
Abstract:
A spectral camera for producing a spectral output is disclosed. The spectral camera has an objective lens for producing an image, a mosaic of filters for passing different bands of the optical spectrum, and a sensor array arranged to detect pixels of the image at the different bands passed by the filters. For each of the pixels, the sensor array has a cluster of sensor elements for detecting the different bands, and the mosaic has a corresponding cluster of filters of different bands, integrated on the sensor elements, so that the image can be detected simultaneously at the different bands. Further, the filters are first-order Fabry-Perot filters, which can provide any desired passband, giving high spectral definition. Crosstalk can be reduced since there is no longer a parasitic cavity.
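Grouping each cluster of sensor elements into one multi-band pixel can be sketched as an array reshape. The 2×2 cluster size and the band ordering below are illustrative assumptions; the abstract does not fix a cluster geometry.

```python
import numpy as np

# Sketch: turning a mosaic-filtered frame into a spectral cube, where each
# output pixel holds the bands detected by one cluster of sensor elements.
# Cluster size (2x2) and band order are illustrative assumptions.

def clusters_to_spectra(frame: np.ndarray, cluster: int = 2) -> np.ndarray:
    """Reshape an (H, W) mosaic frame into an (H/cluster, W/cluster, bands)
    cube, bands = cluster * cluster."""
    h, w = frame.shape
    cube = frame.reshape(h // cluster, cluster, w // cluster, cluster)
    return cube.transpose(0, 2, 1, 3).reshape(
        h // cluster, w // cluster, cluster * cluster)
```

Because all bands of a cluster are captured in the same exposure, the spectral dimension comes at the cost of spatial resolution rather than temporal resolution.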
Abstract:
A method for calibrating an image sensor begins by illuminating a portion of the image sensor with an input light spectrum, where the input light spectrum includes light of known wavelength and intensity. The method continues by sampling an output for each optical sensor of the image sensor, where each optical sensor is associated with one or more optical filters and each optical filter is associated with one of a plurality of groups of optical filters. Each optical filter of a group is configured to pass light in a different wavelength range, and at least some optical filters in different groups of the plurality of groups are configured to pass light in substantially the same wavelength range. The method then continues by comparing the sampled output for each optical sensor with an expected output, generating a calibration factor for each of at least a subset of the plurality of optical sensors, and storing the generated calibration factors in memory.
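The compare-and-generate step can be sketched as below. The simple multiplicative model (factor = expected / sampled) is an illustrative assumption; the abstract only requires comparing sampled and expected outputs and storing the resulting factors.

```python
# Sketch: generating per-sensor calibration factors from a known input
# spectrum, assuming a multiplicative correction model (an illustrative
# assumption, not the only comparison the method could use).

def calibration_factors(sampled: list, expected: list) -> list:
    """One correction factor per optical sensor."""
    return [e / s for s, e in zip(sampled, expected)]

def apply_calibration(sampled: list, factors: list) -> list:
    """Applying the stored factors maps sampled outputs onto expected ones."""
    return [s * f for s, f in zip(sampled, factors)]
```

Because some filters in different groups pass substantially the same wavelength range, their calibrated outputs can also be cross-checked against one another as a consistency test.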