Abstract:
An electronic device comprising: a display; one or more gaze detection sensors for determining a portion of the display to which a user's gaze is currently directed; a timer to measure periods of time associated with the user's current gaze at the display; and one or more processors operative to: receive data relating to the periods of time measured by the timer and determine therefrom a characteristic rate at which the user shifts his gaze from one portion of the display to another; determine a portion of the display towards which the user's gaze was directed for longer than the period of time expected in accordance with his characteristic rate; identify an object included in the determined portion of the display; retrieve information relating to the identified object; and enable displaying information based on the retrieved information.
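The dwell-detection logic described above can be sketched as follows. This is an illustrative approximation, not the patented implementation: the function names, the use of a simple mean to derive the characteristic rate, and the 1.5x threshold factor are all assumptions introduced for demonstration.

```python
# Hypothetical sketch of the abstract's dwell detection: derive a user's
# characteristic gaze-shift rate from measured fixation durations, then
# flag a fixation that lasts longer than expected at that rate.
from statistics import mean

def characteristic_rate(fixation_durations_s):
    """Average gaze shifts per second, from measured fixation durations."""
    return 1.0 / mean(fixation_durations_s)

def is_prolonged(duration_s, rate_per_s, factor=1.5):
    """True if a fixation outlasts the expected duration by `factor` (assumed)."""
    expected_s = 1.0 / rate_per_s
    return duration_s > factor * expected_s

history = [0.3, 0.25, 0.35, 0.3]      # prior fixation durations in seconds
rate = characteristic_rate(history)   # ~3.3 gaze shifts per second
print(is_prolonged(0.9, rate))        # prolonged dwell -> True
print(is_prolonged(0.3, rate))        # typical dwell -> False
```

A prolonged dwell would then trigger the later steps of the abstract: identifying the object in that display portion and retrieving related information.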
Abstract:
A natural user interface (NUI) computer processor is provided herein. The NUI computer processor may include: at least one computer processing module; and a plurality of sensors connected by direct, high-bandwidth connectors to the at least one computer processing module, wherein the computer processing module is configured to provide the full extent of processing power required for simultaneously handling multi-modal, high-resolution information gathered by said sensors, and wherein the computer processing module and the high-bandwidth connectors are cooperatively configured to eliminate non-vital delays, thereby reducing the latency between human user actions captured by said sensors and the response by the NUI computer processor.
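The latency goal above can be made concrete with a small measurement sketch: timestamp a sensor event at capture, timestamp the processor's response, and check the end-to-end delay against a budget. The 100 ms budget, the function names, and the placeholder processing step are illustrative assumptions, not taken from the abstract.

```python
# Hypothetical end-to-end latency check for a NUI event pipeline.
import time

LATENCY_BUDGET_S = 0.100  # assumed responsiveness budget, not from the abstract

def handle_event(capture_time_s, process):
    """Run `process` on a sensor event and report end-to-end latency."""
    result = process()
    latency_s = time.monotonic() - capture_time_s
    return result, latency_s, latency_s <= LATENCY_BUDGET_S

t0 = time.monotonic()  # timestamp taken when the sensor captured the action
result, latency_s, within_budget = handle_event(t0, lambda: "gesture-recognized")
print(result, within_budget)
```

In the architecture the abstract describes, direct high-bandwidth sensor connections and sufficient processing headroom are what keep such a measured latency within budget under simultaneous multi-modal load.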
Abstract:
A method is provided for use in a stereoscopic image generating system, the system including at least two image capturing sensors and at least one aggregation processor. The at least one aggregation processor is configured to: receive data associated with an image captured by the image capturing sensors; calculate aggregation results for a pre-defined number of disparity levels based on data received from one of the at least two image capturing sensors; estimate aggregation results for data received from another of the image capturing sensors; and combine the calculated results with the estimated results.
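The calculate/estimate/combine flow above can be sketched on 1-D image rows. This is not the patented method: the sum-of-absolute-differences cost, the shift-and-reuse estimation rule for the second sensor, and the averaging combination are all assumptions chosen to illustrate the pipeline shape.

```python
# Illustrative disparity-level cost aggregation for one image row per sensor.
def aggregate_costs(left, right, num_disparities, window=1):
    """SAD matching cost per pixel, aggregated over a window, per disparity."""
    width = len(left)
    costs = []
    for d in range(num_disparities):
        row = []
        for x in range(width):
            lo, hi = max(0, x - window), min(width, x + window + 1)
            c = sum(abs(left[i] - right[i - d]) if i - d >= 0 else 255
                    for i in range(lo, hi))
            row.append(c)
        costs.append(row)
    return costs

def estimate_from_other_view(costs, d):
    """Estimate the other sensor's aggregation by reusing disparity-shifted results."""
    row = costs[d]
    return [row[min(x + d, len(row) - 1)] for x in range(len(row))]

def combine(calculated, estimated):
    """Fuse calculated and estimated aggregation results (simple average)."""
    return [(a + b) / 2 for a, b in zip(calculated, estimated)]

left  = [10, 10, 50, 50, 10]                       # row from sensor 1
right = [10, 50, 50, 10, 10]                       # row from sensor 2
costs = aggregate_costs(left, right, num_disparities=3)
est   = estimate_from_other_view(costs, d=1)       # estimated for sensor 2
fused = combine(costs[1], est)                     # combined result at d=1
print(len(costs), len(fused))
```

Estimating one view's aggregation from the other, rather than computing both, roughly halves the aggregation work, which is the apparent motivation for combining calculated and estimated results.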