Abstract:
A memory unit stores conversion information derived based on a correction coefficient which keeps, within a predetermined allowable range from a minimum value, a sum of a norm of a color difference between a target color, which is a target of correction of a correction target color, and a correction color obtained by correcting the correction target color using the correction coefficient, and a norm of the correction coefficient to which a predetermined weight coefficient is applied. A color correcting unit converts a color of each pixel of an image captured by an image capturing unit based on the conversion information.
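The coefficient derivation above resembles a ridge-regularized least-squares fit: minimize the color-difference norm plus the weighted norm of the coefficient. The sketch below makes that assumption; the 3x3 matrix model, the closed-form solution, and all function names are illustrative, not the patented formulation.

```python
import numpy as np

def derive_correction_matrix(source_colors, target_colors, weight=0.01):
    """Fit a 3x3 correction matrix C minimizing
    ||targets - sources @ C.T||^2 + weight * ||C||^2 (ridge-style).
    source_colors, target_colors: (N, 3) arrays of RGB triples."""
    X = np.asarray(source_colors, dtype=float)   # correction target colors
    Y = np.asarray(target_colors, dtype=float)   # target colors
    # Closed-form ridge solution: C.T = (X^T X + w I)^-1 X^T Y
    A = X.T @ X + weight * np.eye(3)
    return np.linalg.solve(A, X.T @ Y).T

def apply_correction(C, pixels):
    """Convert each pixel color using the derived conversion information."""
    return np.asarray(pixels, dtype=float) @ C.T
```

With a near-zero weight and identical source/target colors, the fitted matrix approaches the identity, which is a quick sanity check on the regularized fit.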
Abstract:
A method includes: detecting that an index value of an event has satisfied a predetermined condition; acquiring an amount of an action related to the event in a time zone corresponding to a time point when the index value has satisfied the condition based on first information that allows identifying an action performed by a target person at each of a plurality of time points; comparing the acquired amount of the action related to the event in the time zone with an amount of an action related to the event in a past time zone; and identifying a cause corresponding to a comparison result based on second information that allows identifying, for a case where a cause for the index value to satisfy the condition has not occurred, a change tendency of an amount of an action related to the event when the cause has occurred.
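The flow above can be sketched as a small lookup procedure: detect the condition, compare current action amounts with a past time zone, and match the observed change tendency against known tendencies per cause. The data shapes, the threshold test, and the cause table below are assumptions for illustration, not the claimed method.

```python
def identify_cause(index_value, threshold, current_amounts, past_amounts, tendencies):
    """current_amounts / past_amounts: dicts mapping action name -> amount
    in the current and past time zones (the "first information").
    tendencies: dict mapping cause -> {action: 'increase' | 'decrease'}
    observed when that cause occurred (the "second information")."""
    if index_value < threshold:          # condition not satisfied
        return None
    # Compare each action amount with the corresponding past time zone.
    observed = {}
    for action, amount in current_amounts.items():
        past = past_amounts.get(action, amount)
        if amount > past:
            observed[action] = 'increase'
        elif amount < past:
            observed[action] = 'decrease'
    # Identify the cause whose known change tendency matches the comparison.
    for cause, pattern in tendencies.items():
        if all(observed.get(a) == change for a, change in pattern.items()):
            return cause
    return None
```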
Abstract:
An apparatus includes: an infrared light source configured to emit an infrared light within a specific wavelength band; an imaging element configured to output a color signal which corresponds to an incident light; an optical filter configured to be always inserted into an optical path to the imaging element and attenuate an infrared light with a wavelength outside the specific wavelength band; and a color corrector configured to correct the color signal output from the imaging element and approximate spectral sensitivity characteristics of each color of the imaging element in the specific wavelength band of a wavelength band of an infrared light to human cone characteristics.
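One plausible reading of the color-corrector stage is a per-channel linear map fitted so that the imaging element's spectral responses inside the pass band approximate target cone-like responses. The least-squares fit and all names below are assumptions; the patent does not specify this form.

```python
import numpy as np

def fit_corrector(sensor_responses, cone_responses):
    """Fit a 3x3 matrix M so that M @ sensor ≈ cone response at each sampled
    wavelength. Both inputs: (N, 3) arrays sampled within the specific
    wavelength band passed by the optical filter."""
    S = np.asarray(sensor_responses, dtype=float)
    T = np.asarray(cone_responses, dtype=float)
    M, *_ = np.linalg.lstsq(S, T, rcond=None)    # S @ M ≈ T
    return M.T

def correct_signal(M, rgb):
    """Apply the fitted correction to one color signal."""
    return M @ np.asarray(rgb, dtype=float)
```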
Abstract:
A line-of-sight analysis method, performed by a computer, includes: identifying, when a selection event with respect to a first object out of a plurality of objects is detected, a second object different from the first object, based on a detection status of line-of-sight relative to the objects in a most recent time period of the selection event; and outputting information indicating the identified second object.
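A minimal sketch of the identification step, assuming the "detection status" is a list of timestamped gaze hits and the second object is the one gazed at most in a fixed window before the selection event; the record format and window length are illustrative assumptions.

```python
def identify_second_object(gaze_records, first_obj, event_time, window=2.0):
    """gaze_records: list of (timestamp, object_id) line-of-sight detections.
    Returns the object other than first_obj gazed at most in the most
    recent time period before the selection event, or None."""
    counts = {}
    for t, obj in gaze_records:
        if event_time - window <= t <= event_time and obj != first_obj:
            counts[obj] = counts.get(obj, 0) + 1
    if not counts:
        return None
    # The identified second object is the most frequently gazed-at candidate.
    return max(counts, key=counts.get)
```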
Abstract:
A gaze detection apparatus includes a light source which illuminates a user's eye, an imaging unit which generates an image by capturing the user's face, and a processor adapted to: estimate a distance between the imaging unit and the user's face; calculate the ratio of the distance between the imaging unit and the user's face to a separation distance between the imaging unit and the light source and determine, based on the ratio, whether a bright pupil phenomenon occurs; detect from the captured image a corneal reflection image of the light source and a center of the user's pupil when determining that the bright pupil phenomenon does not occur; and detect the user's gaze direction or gaze position based on a positional relationship between the center of the user's pupil and the corneal reflection image when determining that the bright pupil phenomenon does not occur.
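The ratio test and the positional-relationship step can be sketched as below. The threshold value and the simple difference-vector gaze estimate are illustrative assumptions; the abstract only states that the determination is "based on the ratio".

```python
def bright_pupil_occurs(face_distance, light_separation, ratio_threshold=10.0):
    """Bright pupil tends to occur when the light source is nearly coaxial
    with the camera as seen from the eye, i.e. when the face distance is
    large relative to the camera-to-light-source separation."""
    return (face_distance / light_separation) > ratio_threshold

def gaze_vector(pupil_center, corneal_reflection):
    """Gaze estimated from the positional relationship between the pupil
    center and the corneal reflection image (2D image coordinates)."""
    return (pupil_center[0] - corneal_reflection[0],
            pupil_center[1] - corneal_reflection[1])
```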
Abstract:
An imaging unit has sensitivity to visible light and infrared light and captures an image. When the distribution of the color of each pixel in the image captured by the imaging unit is calculated, a deriving unit derives a predetermined feature amount indicating the range of the color distribution. An estimating unit estimates a lighting environment during imaging based on the feature amount derived by the deriving unit.
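A hedged sketch of the deriving and estimating units: compute a feature amount indicating the range of the per-pixel color distribution, then threshold it. The specific feature (chromaticity spread) and the threshold are assumptions chosen for illustration; a scene lit mainly by a single infrared-heavy illuminant tends toward a narrow chromaticity spread.

```python
import numpy as np

def color_range_feature(image):
    """image: (H, W, 3) RGB array. Returns a feature amount indicating the
    range of the color distribution: total spread of (r, g) chromaticities."""
    rgb = np.asarray(image, dtype=float).reshape(-1, 3)
    s = rgb.sum(axis=1) + 1e-9
    chroma = rgb[:, :2] / s[:, None]             # (r, g) chromaticities
    return float(np.ptp(chroma, axis=0).sum())   # range along each axis

def estimate_lighting(image, threshold=0.2):
    """Estimate the lighting environment during imaging from the feature."""
    return 'visible' if color_range_feature(image) > threshold else 'infrared'
```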
Abstract:
A determining unit determines which one of a plurality of divided areas in color distribution a value of an input image signal falls within. The plurality of divided areas are defined for each of R, G, and B color components, and the determining unit determines, for each color component, which one of the plurality of divided areas the value of the input image signal falls within. An image correcting unit reads correction coefficients corresponding to the determined divided area for each color component from a correction coefficient storing unit in which correction coefficients are registered for each of the color components and each of the divided areas. The image correcting unit calculates each of R, G, and B output values of an output image signal using the correction coefficients of the individual color components read from the correction coefficient storing unit.
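The per-channel, per-area lookup can be sketched as follows. The area boundaries, the (gain, offset) linear model per area, and the coefficient table are hypothetical; the abstract only requires that coefficients be registered per color component and per divided area.

```python
import bisect

# Hypothetical area boundaries and per-channel, per-area (gain, offset) table.
BOUNDARIES = [64, 128, 192]   # splits 0-255 into four divided areas
COEFFS = {
    'R': [(1.0, 0), (1.1, -6), (1.05, 0), (0.95, 13)],
    'G': [(1.0, 0), (1.0, 0), (1.02, -3), (1.0, 0)],
    'B': [(0.9, 5), (1.0, 0), (1.0, 0), (1.1, -20)],
}

def divided_area(value):
    """Determine which divided area the input signal value falls within."""
    return bisect.bisect_right(BOUNDARIES, value)

def correct_pixel(r, g, b):
    """Read the coefficients for each component's area and compute outputs."""
    out = []
    for channel, value in zip('RGB', (r, g, b)):
        gain, offset = COEFFS[channel][divided_area(value)]
        out.append(min(255, max(0, round(gain * value + offset))))
    return tuple(out)
```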
Abstract:
An apparatus receives, via an input device, query input data including a word or a phrase, and acquires search result set data using the query input data. The apparatus acquires a value indicating a strength of a relationship between each impression word included in an impression word group and each word included in the query input data, and extracts a first feature word group from the impression word group according to the value indicating the strength of the relationship with each word. The apparatus displays the search result set data using the first feature word group as an item.
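The extraction step can be sketched as a scored selection: sum a precomputed relationship-strength value for each impression word over every query word, then keep the strongest as the feature word group. The strength table and the top-k cutoff below are illustrative assumptions.

```python
def extract_feature_words(query_words, strength, impression_words, top_k=2):
    """strength: dict mapping (impression_word, query_word) -> value
    indicating the strength of the relationship between the two words.
    Returns the first feature word group for use as display items."""
    scored = []
    for iw in impression_words:
        total = sum(strength.get((iw, qw), 0.0) for qw in query_words)
        scored.append((total, iw))
    scored.sort(reverse=True)
    # Keep the impression words most strongly related to the query.
    return [iw for total, iw in scored[:top_k] if total > 0]
```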