Abstract:
A multi-core processing system, such as a system-on-chip (SOC), is configured to use interrupts when sending processed data from a source processing core to a destination processing core. The source processing core may delay sending interrupts, but still keep processing data, when an acknowledgment for a previous interrupt is not received from an inter-processor communication controller. When the acknowledgment is received, the source processing core may resume sending an interrupt for the next chunk of data processed. As such, not all chunks of data may have associated interrupts.
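A minimal C sketch of the scheme described above, using hypothetical names (ipc_raise_interrupt, ipc_on_ack, process_chunks) and a single atomic flag standing in for the inter-processor communication controller; the real hardware interface is not specified by the abstract.

#include <stdatomic.h>
#include <stdbool.h>
#include <stdio.h>

static atomic_bool ipc_ack_pending = false; /* set while an interrupt is outstanding */

/* Stand-in for the inter-processor communication controller doorbell. */
static void ipc_raise_interrupt(int chunk_id) {
    atomic_store(&ipc_ack_pending, true);
    printf("IRQ raised for chunk %d\n", chunk_id);
}

/* Called from the acknowledgment path (e.g. the controller's ack ISR). */
void ipc_on_ack(void) {
    atomic_store(&ipc_ack_pending, false);
}

/* Source-core loop: processing never stalls waiting for an ack, so some
 * chunks are published without a dedicated interrupt of their own. */
void process_chunks(int n_chunks) {
    for (int id = 0; id < n_chunks; ++id) {
        /* ... process chunk `id` and publish it to shared memory ... */
        if (!atomic_load(&ipc_ack_pending)) {
            ipc_raise_interrupt(id);   /* notify the destination core */
        }
        /* else: defer; the destination drains all queued chunks on the
         * next interrupt it services. */
    }
}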
Abstract:
Examples are described of automatic exposure timing synchronization. An imaging system includes a first image sensor configured to capture a first image according to a first exposure timing, including by exposing first region of interest (ROI) image data at the first image sensor for a first ROI exposure time period. Based on the first exposure timing, the imaging system sets a second exposure timing for a second image sensor to capture a second image. Capture of the second image according to the second exposure timing includes exposure of second ROI image data at the second image sensor for a second ROI exposure time period. The second exposure timing may be set so that the start of the second ROI exposure time period aligns with the start of the first ROI exposure time period, and/or so that the first and second ROI exposure time periods overlap.
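The alignment can be illustrated with simple rolling-shutter arithmetic. The sketch below assumes a linear timing model (frame start plus line index times line time) and invented field names; it is not the patented method, only one way the second sensor's frame start could be chosen so both ROI exposure windows open together.

#include <stdint.h>

typedef struct {
    uint64_t frame_start_us;   /* exposure start for line 0 */
    uint32_t line_time_us;     /* time between successive line exposures */
    uint32_t roi_first_line;   /* first line of the region of interest */
    uint32_t roi_exposure_us;  /* ROI exposure duration */
} sensor_timing_t;

/* Start of the ROI exposure window under a rolling shutter. */
static uint64_t roi_exposure_start(const sensor_timing_t *s) {
    return s->frame_start_us + (uint64_t)s->roi_first_line * s->line_time_us;
}

/* Choose the second sensor's frame start so its ROI window opens at the
 * same instant as the first sensor's (assumes the target time is not
 * earlier than the second sensor's ROI line delay). */
void align_roi_start(const sensor_timing_t *first, sensor_timing_t *second) {
    uint64_t target    = roi_exposure_start(first);
    uint64_t roi_delay = (uint64_t)second->roi_first_line * second->line_time_us;
    second->frame_start_us = target - roi_delay;
}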
Abstract:
A method for camera processing using a camera application programming interface (API) is described. A processor executing the camera API may be configured to receive instructions that specify a use case for a camera pipeline, the use case defining at least one or more processing engines of a plurality of processing engines for processing image data with the camera pipeline, wherein the plurality of processing engines includes one or more of fixed-function image signal processing nodes internal to a camera processor and one or more processing engines external to the camera processor. The processor may be further configured to route image data to the one or more processing engines specified by the instructions, and return the results of processing the image data with the one or more processing engines to the application.
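As a sketch of the routing idea, the following C fragment models each processing engine (fixed-function ISP node or external engine) behind one function-pointer interface and pipes an image buffer through whichever engines the use case selects. The types and the in-place-transform interface are illustrative assumptions, not the API described by the abstract.

#include <stddef.h>

typedef struct {
    unsigned char *pixels;
    size_t         size;
} image_buf_t;

/* Each engine transforms the buffer in place; an internal ISP node and
 * an external engine would sit behind the same interface. */
typedef void (*engine_fn)(image_buf_t *buf);

typedef struct {
    engine_fn engines[8];  /* engines selected by the use case */
    size_t    n_engines;
} use_case_t;

/* Route the image through the engines the use case specifies, then hand
 * the processed buffer back to the caller (the application). */
image_buf_t *run_use_case(const use_case_t *uc, image_buf_t *buf) {
    for (size_t i = 0; i < uc->n_engines; ++i)
        uc->engines[i](buf);
    return buf;
}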
Abstract:
This disclosure provides systems, methods, and devices for wireless communication that support improved routing of data from image sensors that share a PHY across different secure domains. In a first aspect, a device may receive a packet from an image sensor along a physical data connection. The device may determine a virtual channel associated with the packet and may determine a first secure domain for the packet based on the virtual channel. The first secure domain may be selected from a plurality of secure domains accessible via the physical data connection, such as based on a mapping maintained by the device. The device may then route the packet within the first secure domain such that further processing and storage of the packet occurs within the first secure domain, such as within a context base associated with the first secure domain. Other aspects and features are also claimed and described.
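A minimal sketch of the virtual-channel dispatch, assuming a CSI-2-style VC field in the packet header and an invented two-domain mapping table; the actual mapping and domain handling maintained by the device are not specified here.

#include <stdint.h>
#include <stdio.h>

enum secure_domain { DOMAIN_NON_SECURE, DOMAIN_SECURE_CAM };

typedef struct {
    uint8_t virtual_channel;  /* virtual channel carried in the packet header */
    /* ... payload ... */
} csi_packet_t;

/* Mapping maintained by the device: one secure domain per virtual channel. */
static enum secure_domain vc_to_domain[4] = {
    DOMAIN_NON_SECURE,   /* VC0: e.g. preview sensor */
    DOMAIN_SECURE_CAM,   /* VC1: e.g. a sensor in a secure domain */
    DOMAIN_NON_SECURE,
    DOMAIN_NON_SECURE,
};

void route_packet(const csi_packet_t *pkt) {
    enum secure_domain d = vc_to_domain[pkt->virtual_channel & 0x3];
    switch (d) {
    case DOMAIN_SECURE_CAM:
        /* write only to buffers owned by the secure domain */
        printf("VC%u -> secure domain\n", pkt->virtual_channel);
        break;
    default:
        printf("VC%u -> non-secure domain\n", pkt->virtual_channel);
        break;
    }
}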
Abstract:
In some aspects, the present disclosure provides a method for sharing a single optical sensor between multiple image processors. In some embodiments, the method includes receiving, at a control arbiter, a first desired configuration of a first one or more desired configurations for capturing an image frame by the optical sensor, the first one or more desired configurations communicated from a primary image processor. The method may also include receiving, at the control arbiter, a second desired configuration of a second one or more desired configurations for capturing the image frame by the optical sensor, the second one or more desired configurations communicated from a secondary image processor. The method may also include determining, by the control arbiter, an actual configuration for capturing the image frame by the optical sensor, the actual configuration based on the first desired configuration and the second desired configuration.
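The arbitration step might look like the sketch below. The merge policy shown (the primary's gain, the shorter of the two requested exposures) is purely an illustrative assumption; the abstract says only that the actual configuration is based on both desired configurations.

#include <stdint.h>

typedef struct {
    uint32_t exposure_us;
    uint32_t gain_milli;   /* analog gain x1000 */
} sensor_cfg_t;

static uint32_t min_u32(uint32_t a, uint32_t b) { return a < b ? a : b; }

/* One possible reconciliation: keep the primary processor's gain, and
 * take the shorter requested exposure so neither consumer's frame is
 * over-exposed. */
sensor_cfg_t arbitrate(const sensor_cfg_t *primary, const sensor_cfg_t *secondary) {
    sensor_cfg_t actual;
    actual.gain_milli  = primary->gain_milli;
    actual.exposure_us = min_u32(primary->exposure_us, secondary->exposure_us);
    return actual;
}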
Abstract:
Described herein are methods, apparatus, and computer-readable media to autofocus a lens of an imaging device. Input parameters are received indicating a lens position. Lens actuator characteristics are determined. Lens damping parameters may be determined based, at least in part, on the input parameters and the lens actuator characteristics. In some aspects, the lens damping parameters include a lens movement step size and a time delay between each step. In some aspects, the lens damping parameters include damping parameters for a plurality of regions of lens movement. Lens movement parameters are determined based, at least in part, on the input parameters and the lens damping parameters. The lens is then autofocused by moving it according to the lens movement parameters.
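A damped move of this kind can be sketched as follows, assuming an invented actuator interface and fixed per-call damping parameters (step size and inter-step delay); a region-based variant would look up a damping_t per region of lens travel.

#include <stdlib.h>
#include <unistd.h>   /* usleep */

typedef struct {
    int      step_size;   /* lens codes per step */
    unsigned delay_us;    /* settle time between steps */
} damping_t;

/* Stand-in for programming the lens actuator to an absolute position. */
static void actuator_set_position(int pos) {
    (void)pos;  /* program the VCM/actuator hardware here */
}

/* Move from `current` to `target` in damped steps so the lens settles
 * instead of ringing at the end of a large move. */
void move_lens_damped(int current, int target, const damping_t *d) {
    int dir = (target > current) ? 1 : -1;
    while (current != target) {
        int remaining = abs(target - current);
        int step = remaining < d->step_size ? remaining : d->step_size;
        current += dir * step;
        actuator_set_position(current);
        usleep(d->delay_us);  /* let the lens settle before the next step */
    }
}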