Abstract:
An electronic device may have components that experience performance variations as the device changes orientation relative to a user. Changes in the orientation of the device relative to the user can be monitored using a motion sensor. A camera may be used to periodically capture images of a user's eyes. By processing the images to produce accurate orientation information reflecting the position of the user's eyes relative to the device, the orientation of the device tracked by the motion sensor can be periodically updated. The components may include audio components such as microphones and speakers and may include a display with an array of pixels for displaying images. Control circuitry in the electronic device may modify pixel values for the pixels in the array to compensate for angle-of-view-dependent pixel appearance variations based on the orientation information from the motion sensor and the camera.
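The abstract describes fusing motion-sensor orientation tracking with periodic camera-based eye-position updates, then compensating pixel values for the resulting viewing angle. The Python sketch below illustrates one way such a loop could work; the class names, the complementary blend, and the cosine falloff model are illustrative assumptions, not details from the source.

```python
import math

class OrientationTracker:
    """Tracks the viewing angle between the display normal and the user's gaze
    by integrating gyroscope data and periodically correcting drift with
    camera-based eye-position estimates (hypothetical interfaces)."""

    def __init__(self):
        self.view_angle_deg = 0.0

    def update_from_gyro(self, angular_rate_deg_s, dt_s):
        # Dead-reckon the viewing angle from the motion sensor.
        self.view_angle_deg += angular_rate_deg_s * dt_s

    def correct_from_camera(self, measured_angle_deg, blend=0.8):
        # A periodic camera measurement of the user's eyes overrides
        # accumulated gyro drift (simple complementary blend).
        self.view_angle_deg = (blend * measured_angle_deg
                               + (1.0 - blend) * self.view_angle_deg)

def compensate_pixel(value, view_angle_deg):
    """Scale a pixel value to offset angle-of-view-dependent dimming.
    The cosine falloff model is an assumption for illustration."""
    gain = 1.0 / max(math.cos(math.radians(view_angle_deg)), 0.5)
    return min(int(value * gain), 255)

tracker = OrientationTracker()
tracker.update_from_gyro(angular_rate_deg_s=10.0, dt_s=0.5)   # gyro estimates 5 degrees
tracker.correct_from_camera(measured_angle_deg=7.0)           # camera refines the estimate
print(compensate_pixel(128, tracker.view_angle_deg))
```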
Abstract:
An electronic device may include a display and control circuitry that operates the display. The control circuitry may be configured to daltonize input images to produce daltonized output images that allow a user with color vision deficiency to see a range of detail that the user would otherwise miss. The daltonization algorithm that the control circuitry applies to input images may be specific to the type of color vision deficiency that the user has. The daltonization strength that the control circuitry applies to the image or portions of the image may vary based on image content. For example, natural images may be daltonized with a lower daltonization strength than web browsing content, which ensures that memory colors such as blue sky and green grass do not appear unnatural to the user while still allowing important details such as hyperlinks and highlighted text to be distinguishable.
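As a rough illustration of content-dependent daltonization strength, the sketch below applies a classic error-redistribution correction and blends it with the original pixel according to the content type. The simulation and redistribution matrices and the strength values are placeholders; real values would depend on the user's specific color vision deficiency.

```python
import numpy as np

# Placeholder matrices: one simulates the deficiency, the other redistributes
# the lost detail into channels the user can see.
SIMULATE = np.array([[0.625, 0.375, 0.0],
                     [0.700, 0.300, 0.0],
                     [0.000, 0.300, 0.7]])
REDISTRIBUTE = np.array([[0.0, 0.0, 0.0],
                         [0.7, 1.0, 0.0],
                         [0.7, 0.0, 1.0]])

def daltonize(rgb, strength):
    """Error-redistribution daltonization with a tunable strength:
    strength=0 leaves the pixel untouched, strength=1 applies full correction."""
    error = rgb - SIMULATE @ rgb
    corrected = np.clip(rgb + REDISTRIBUTE @ error, 0, 255)
    return (1.0 - strength) * rgb + strength * corrected

def strength_for_content(content_type):
    # Natural images get a gentler correction so memory colors such as blue sky
    # and green grass stay plausible; web/UI content gets a stronger one.
    return {"natural": 0.3, "web": 0.9}.get(content_type, 0.6)

pixel = np.array([200.0, 80.0, 60.0])
print(daltonize(pixel, strength_for_content("natural")))
print(daltonize(pixel, strength_for_content("web")))
```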
Abstract:
This disclosure relates to image capture devices with the ability to perform adaptive white balance correction using a switchable white reference (SWR). In some embodiments, the image capture device utilizes “true white” information to record images that better represent users' perceptions. In other embodiments, the same SWR and camera that dynamically sample ambient lighting conditions are used to determine “true white” in near real-time. In other embodiments, the image capture device comprises a display screen that utilizes the “true white” information in near real-time to dynamically adjust the display. In other embodiments, face detection techniques and/or ambient light sensors may be used to determine which device camera is most closely aligned with the direction in which the user of the device is currently looking, and that camera may then be used to capture a “true white” image in the direction that most closely corresponds to the ambient lighting conditions that currently dominate the user's perception.
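One way the “true white” sample could translate into a correction is sketched below: per-channel gains are derived from a raw capture of the switchable white reference and applied to the recorded image. The gray-world-style normalization and the function names are assumptions for illustration only.

```python
def white_balance_gains(reference_rgb):
    """Derive per-channel gains from a capture of the switchable white
    reference under the current ambient light, normalizing so that the
    reference renders as neutral (illustrative scaling rule)."""
    r, g, b = reference_rgb
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    return tuple(min(int(c * g), 255) for c, g in zip(pixel, gains))

# Example: the SWR sample looks slightly orange under warm indoor light,
# so the red channel is attenuated and the blue channel boosted.
gains = white_balance_gains((220, 200, 170))
print(apply_gains((180, 150, 120), gains))
```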
Abstract:
An electronic device may include a display having an array of display pixels and having display control circuitry that controls the operation of the display. The display control circuitry may adaptively adjust the display output based on ambient lighting conditions. For example, in cooler ambient lighting conditions such as those dominated by daylight, the display may display neutral colors using a relatively cool white. When the display is operated in warmer ambient lighting conditions such as those dominated by indoor light sources, the display may display neutral colors using a relatively warm white. Adapting to the ambient lighting conditions may ensure that the user does not perceive color shifts on the display as the user's vision chromatically adapts to different ambient lighting conditions. Adaptively adjusting images in this way can also have beneficial effects on the human circadian rhythm by displaying warmer colors in the evening.
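A minimal sketch of this kind of adaptation, assuming the ambient light sensor reports a correlated color temperature: the display white point is clamped to an adaptation range and per-channel gains are interpolated between a warm and a cool white. The gain values and linear interpolation are illustrative; a real implementation would use a proper chromatic adaptation transform.

```python
def adapted_white_point(ambient_cct_kelvin):
    """Clamp the ambient correlated color temperature to an assumed
    adaptation range for the display white point."""
    cool_cct, warm_cct = 6500.0, 2700.0
    return min(max(ambient_cct_kelvin, warm_cct), cool_cct)

def channel_gains_for_white_point(cct_kelvin):
    # Crude linear interpolation between a warm gain set and a cool gain set.
    t = (cct_kelvin - 2700.0) / (6500.0 - 2700.0)
    warm = (1.00, 0.92, 0.78)   # relatively warm white for indoor/evening light
    cool = (1.00, 1.00, 1.00)   # relatively cool (native) white for daylight
    return tuple(w + t * (c - w) for w, c in zip(warm, cool))

print(channel_gains_for_white_point(adapted_white_point(3000)))  # warm indoor lighting
print(channel_gains_for_white_point(adapted_white_point(6000)))  # daylight
```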
Abstract:
Systems and methods for reducing or eliminating image artifacts on a dual-layer liquid crystal display (LCD). By way of example, a system includes a first display panel and a second display panel. The system includes a processor coupled to the first display panel and the second display panel, and configured to generate a first image, and to generate a second image to be displayed on the first display panel based on the first image. The processor is configured to interpolate the second image. Interpolating the second image includes adjusting the second image according to a generated objective function bounded by a first constraint. The processor is configured to filter the second image, and to generate a third image to be displayed on the second display panel based on the first image and the second image. The third image is generated to prevent image artifacts on the second display panel.
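The sketch below shows the general idea of splitting a target image across two stacked panels: a smoothed rear-panel image and a front-panel image that compensates for it so their product reproduces the target. The box filter here stands in for the constrained interpolation and filtering steps described above; it is not the patent's objective function.

```python
import numpy as np

def panel_images(target, max_attenuation=0.05):
    """Split a target image across two stacked LCD panels (illustrative).
    The rear-panel image is a smoothed luminance map; the front-panel image
    divides it out so that rear * front approximately equals the target."""
    luminance = target.mean(axis=2)
    rear = np.clip(luminance / 255.0, max_attenuation, 1.0)

    # 3x3 box filter as the smoothing/interpolation step, which keeps
    # rear-panel transitions gradual and reduces halo artifacts at edges.
    padded = np.pad(rear, 1, mode="edge")
    rear = sum(padded[i:i + rear.shape[0], j:j + rear.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

    # Front panel compensates for the rear panel.
    front = np.clip(target / rear[..., None], 0, 255)
    return rear, front

target = np.full((4, 4, 3), 120.0)
rear, front = panel_images(target)
print(np.allclose(rear[..., None] * front, target))
```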
Abstract:
An electronic device that includes a display and an eye tracker configured to collect eye tracking data regarding a gaze of one or more of a user's eyes across the display is disclosed herein. The electronic device includes processing circuitry that is operatively coupled to the display and configured to foveate one or more areas of the display according to the eye tracking data. If the eye tracking data input is lost, the processing circuitry is configured to recover from the loss of eye tracking data by changing one or more aspects of the foveated areas (e.g., size, resolution, etc.) until a threshold is satisfied. As time elapses since loss of eye tracking, the foveated areas move toward a center or a salient region of the display.
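A simple recovery policy of the kind described could look like the following sketch, in which the foveated region grows and drifts from the last known gaze point toward the display center as time since the loss of eye tracking elapses. The radii, recovery period, and drift target are illustrative assumptions.

```python
def foveation_params(time_since_loss_s, display_center=(0.5, 0.5),
                     last_gaze=(0.8, 0.2), recovery_period_s=2.0):
    """Return the center and radius (as fractions of the display) of the
    full-resolution region after eye tracking data is lost. The region
    grows and moves toward the display center until the threshold elapses."""
    t = min(time_since_loss_s / recovery_period_s, 1.0)
    radius = 0.15 + t * (0.5 - 0.15)
    center = tuple(g + t * (c - g) for g, c in zip(last_gaze, display_center))
    return center, radius

for elapsed in (0.0, 1.0, 2.0):
    print(elapsed, foveation_params(elapsed))
```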
Abstract:
In some implementations, a mobile computing device may participate in the calibration of an output signal of a media device. This calibration process includes storing device-specific calibration data that is related to properties of a light sensor of the mobile device. The mobile device then detects properties of light emitted by the display device during a presentation to obtain sensor values related to the emitted light. The calibration process may also ensure that the mobile device is proximate to the display device prior to obtaining the sensor values. The collected sensor values are adjusted using the device-specific calibration data stored on the mobile device to normalize the sensor values relative to a baseline. These normalized sensor values are sent to the media device, which uses them to adjust the output signal.
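The normalization step might look roughly like the sketch below, in which device-specific calibration data (the field names here are assumed, not taken from the source) maps raw sensor readings onto a common baseline before they are sent to the media device.

```python
def normalize_samples(raw_samples, calibration):
    """Normalize light-sensor readings from the mobile device relative to a
    baseline using device-specific calibration data (illustrative fields)."""
    gain = calibration["gain"]
    offset = calibration["dark_offset"]
    baseline = calibration["baseline"]
    return [((s - offset) * gain) / baseline for s in raw_samples]

# Example: readings taken while the display device shows a test presentation.
calibration = {"gain": 1.08, "dark_offset": 4.0, "baseline": 100.0}
normalized = normalize_samples([96.0, 142.5, 58.0], calibration)
print(normalized)   # values the media device can compare across phone models
```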
Abstract:
A display may have display layers that form an array of pixels. An angle-of-view adjustment layer may overlap the display layers. The angle-of-view adjustment layer may include an array of adjustable light blocking structures formed from electrochromic material. The electrochromic material may be interposed between first and second electrode layers. When it is desired to operate the display in a private viewing mode, control circuitry may apply a current to the first and second electrodes that causes the electrochromic material to become more opaque, thereby restricting the angle of view of the display. When it is desired to operate the display in a public viewing mode, control circuitry may apply a current to the first and second electrodes that causes the electrochromic material to become more transparent, thereby opening up the angle of view of the display.
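A minimal control sketch, assuming a hypothetical electrode current driver: positive drive darkens the electrochromic material for the private viewing mode and negative drive clears it for the public viewing mode. The drive levels and polarity are placeholders, not values from the source.

```python
class AngleOfViewLayer:
    """Toggles an electrochromic angle-of-view adjustment layer between
    private and public viewing modes by driving its electrodes."""

    PRIVATE_DRIVE_MA = +5.0   # darkens the electrochromic material
    PUBLIC_DRIVE_MA = -5.0    # clears it again

    def __init__(self, driver):
        self.driver = driver  # hypothetical electrode current driver

    def set_private(self):
        self.driver.apply_current_ma(self.PRIVATE_DRIVE_MA)  # narrow the angle of view

    def set_public(self):
        self.driver.apply_current_ma(self.PUBLIC_DRIVE_MA)   # open up the angle of view

class FakeDriver:
    def apply_current_ma(self, ma):
        print(f"driving electrodes at {ma:+.1f} mA")

layer = AngleOfViewLayer(FakeDriver())
layer.set_private()
layer.set_public()
```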
Abstract:
An electronic device such as a cellular telephone or other device may have a housing with front and rear faces joined by a sidewall. A display may be mounted on the front face. The electronic device may include multiple ambient light sensors such as a front ambient light sensor on the front face and one or more supplemental ambient light sensors on the rear face and/or on the sidewall. The front ambient light sensor gathers a front ambient light intensity measurement and the supplemental ambient light sensor gathers a supplemental ambient light intensity measurement. During operation, control circuitry in the electronic device adjusts the display brightness based on data from the ambient light sensors. The control circuitry may implement power saving restrictions that limit when the supplemental ambient light intensity measurement is taken into account and/or that impose a brightness cap on the display brightness.
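The brightness policy could be sketched as follows: the supplemental reading is only consulted when power-saving restrictions allow it, and a brightness cap is enforced during low-power operation. The lux-to-nits mapping, the max() combination rule, and the cap value are illustrative assumptions.

```python
def display_brightness(front_lux, rear_lux=None, low_power=False, cap_nits=500.0):
    """Combine the front ambient light reading with an optional supplemental
    reading; ignore the supplemental sensor and cap brightness in low power."""
    lux = front_lux
    if rear_lux is not None and not low_power:
        lux = max(front_lux, rear_lux)   # e.g., bright sky behind the device

    nits = 2.0 + 0.6 * lux               # crude lux-to-nits mapping
    if low_power:
        nits = min(nits, cap_nits)
    return nits

print(display_brightness(front_lux=200, rear_lux=900))                   # both sensors used
print(display_brightness(front_lux=1000, low_power=True))                # capped, front only
```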