Abstract:
Methods, systems, and devices for video frame rendering are described. A device, such as a user equipment (UE), may receive a set of video packets over a video connection (e.g., a video telephone service) and generate a set of video frames based on the set of video packets. The device may determine to render a video frame of the set of video frames based on a frame type of the video frame or a rendering criterion including a video quality of the video frame, or both. In some examples, the frame type may include a perfect frame or a corrupted frame. The device may render the video frame of the set of video frames based on the frame type of the video frame or the video frame satisfying the rendering criterion, or both, and output the rendered video frame for display.
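
A minimal sketch of the rendering decision this abstract describes. The VideoFrame type, the quality score, and the QUALITY_THRESHOLD value are illustrative assumptions, not elements of the disclosed system:

```python
from dataclasses import dataclass
from enum import Enum, auto


class FrameType(Enum):
    PERFECT = auto()    # frame decoded from a complete set of packets
    CORRUPTED = auto()  # frame decoded despite missing or damaged packets


@dataclass
class VideoFrame:
    frame_type: FrameType
    quality: float  # assumed quality score in [0.0, 1.0]


QUALITY_THRESHOLD = 0.8  # hypothetical rendering criterion


def should_render(frame: VideoFrame) -> bool:
    """Render perfect frames unconditionally; render corrupted frames
    only when they still satisfy the quality-based rendering criterion."""
    if frame.frame_type is FrameType.PERFECT:
        return True
    return frame.quality >= QUALITY_THRESHOLD


if __name__ == "__main__":
    frames = [
        VideoFrame(FrameType.PERFECT, 1.0),
        VideoFrame(FrameType.CORRUPTED, 0.9),  # good enough to render
        VideoFrame(FrameType.CORRUPTED, 0.4),  # dropped instead of rendered
    ]
    for f in frames:
        print(f, "->", "render" if should_render(f) else "drop")
```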
Abstract:
Methods, systems, and devices are described for media synchronization. Multi-stream media processes may include media streams captured with respect to different clock rates. Multi-processor implementations may involve separate clocks associated with different media streams, such as one clock for audio and another for video. The separate clocks may tend to drift from one another, becoming further out of sync as time passes. Selecting a reference time of one of the processors to function as a “wall clock,” recording frame capture times with respect to the reference time, accounting for propagation delays, and transmitting frame capture times in terms of the reference time may aid in audio-video (AV) synchronization at a device where audio and video streams are received.
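
A minimal sketch of expressing a capture timestamp in the shared reference (“wall clock”) timebase. The clock offset and propagation-delay figures below are assumed for illustration; a real implementation would measure them:

```python
def to_reference_time(local_capture_ts: float,
                      local_to_reference_offset: float,
                      propagation_delay: float) -> float:
    """Convert a frame's local capture timestamp to the reference timebase,
    compensating for propagation delay between the two processors."""
    return local_capture_ts + local_to_reference_offset - propagation_delay


# Example: a video frame captured on a processor whose clock leads the
# reference clock by 2.5 ms, with 0.3 ms of propagation delay (assumed values).
video_capture_ref = to_reference_time(
    local_capture_ts=1000.0000,          # seconds, video processor's clock
    local_to_reference_offset=-0.0025,   # video clock leads reference by 2.5 ms
    propagation_delay=0.0003,            # 0.3 ms to reach the reference processor
)

audio_capture_ref = 999.9976             # audio capture time, already in reference units
skew_ms = (video_capture_ref - audio_capture_ref) * 1000
print(f"video capture (reference): {video_capture_ref:.4f} s, A/V skew: {skew_ms:.2f} ms")
```

With both capture times expressed against the same reference clock, the receiving device can compare them directly to align the audio and video streams for playback.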
Abstract:
The present disclosure relates to methods and devices for wireless communication of an apparatus, e.g., a user equipment (UE). In one aspect, the apparatus may determine whether a connection of a video call is interrupted, the video call including a plurality of decoded frames. The apparatus may also determine, if the connection of the video call is interrupted, whether one or more decoded frames of the plurality of decoded frames are suitable for artificial frame generation. The apparatus may also generate one or more artificial frames based on the one or more decoded frames and an audio feed from a transmitting device. Additionally, the apparatus may determine whether the one or more artificial frames are suitable for a facial model call. The apparatus may also establish a facial model call based on a combination of the one or more artificial frames and the audio feed from the transmitting device.
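
A minimal sketch of the interruption-handling flow this abstract describes, assuming hypothetical helpers for the suitability checks, artificial frame generation, and call establishment; none of these names come from the disclosure itself:

```python
from dataclasses import dataclass


@dataclass
class DecodedFrame:
    has_clear_face: bool  # assumed suitability criterion for artificial frame generation


def suitable_for_generation(frames: list[DecodedFrame]) -> list[DecodedFrame]:
    """Keep only the decoded frames assumed usable for artificial frame generation."""
    return [f for f in frames if f.has_clear_face]


def generate_artificial_frames(frames: list[DecodedFrame], audio_feed: bytes) -> list[bytes]:
    # Placeholder: a real implementation would animate a facial model using the audio feed.
    return [b"artificial-frame" for _ in frames]


def handle_interruption(connection_interrupted: bool,
                        decoded_frames: list[DecodedFrame],
                        audio_feed: bytes) -> bool:
    """Return True if a facial model call was established."""
    if not connection_interrupted:
        return False
    usable = suitable_for_generation(decoded_frames)
    if not usable:
        return False
    artificial = generate_artificial_frames(usable, audio_feed)
    if not artificial:  # artificial frames not suitable for a facial model call
        return False
    # Establish the facial model call from the artificial frames plus the audio feed.
    print(f"facial model call established with {len(artificial)} artificial frame(s)")
    return True


handle_interruption(True, [DecodedFrame(True), DecodedFrame(False)], b"\x00" * 160)
```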