libyuv

How to convert a kCVPixelFormatType_420YpCbCr8BiPlanarFullRange buffer to YUV420 using the libyuv library in iOS?

妖精的绣舞 submitted on 2020-01-01 07:11:24
Question: I have captured video using AVFoundation. I have set the video settings and get the output sample buffer in kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format, but I need the YUV420 (I420) format for further processing. For that I use the libyuv framework: LIBYUV_API int NV12ToI420(const uint8* src_y, int src_stride_y, const uint8* src_uv, int src_stride_uv, uint8* dst_y, int dst_stride_y, uint8* dst_u, int dst_stride_u, uint8* dst_v, int dst_stride_v, int width, int height); libyuv::NV12ToI420(src_yplane,
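
A minimal C++ sketch of that call (compiled as Objective-C++ on iOS), assuming the captured pixel buffer really is the bi-planar NV12 layout and that dst_i420 is a caller-allocated buffer of width * height * 3 / 2 bytes; the helper name convertNV12PixelBufferToI420 is hypothetical:

#include <CoreVideo/CoreVideo.h>
#include "libyuv.h"

// Hypothetical helper: copy an NV12 CVPixelBufferRef into a caller-allocated
// I420 buffer laid out as [Y plane][U plane][V plane] with tight strides.
bool convertNV12PixelBufferToI420(CVPixelBufferRef pixelBuffer, uint8_t* dst_i420) {
  CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

  const int width  = (int)CVPixelBufferGetWidth(pixelBuffer);
  const int height = (int)CVPixelBufferGetHeight(pixelBuffer);

  // Plane 0 is Y, plane 1 is the interleaved UV plane in the bi-planar formats.
  const uint8_t* src_y  = (const uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
  const int src_stride_y  = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
  const uint8_t* src_uv = (const uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
  const int src_stride_uv = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

  // Destination I420 planes packed back to back.
  uint8_t* dst_y = dst_i420;
  uint8_t* dst_u = dst_y + width * height;
  uint8_t* dst_v = dst_u + (width / 2) * (height / 2);

  const int ret = libyuv::NV12ToI420(src_y, src_stride_y,
                                     src_uv, src_stride_uv,
                                     dst_y, width,
                                     dst_u, width / 2,
                                     dst_v, width / 2,
                                     width, height);

  CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
  return ret == 0;
}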

NV12 format and UV plane

守給你的承諾、 submitted on 2019-12-23 10:14:05
Question: I am a little confused about the NV12 format. I am looking at this page to understand the format. What I currently understand is that if you have an image or video of 640 x 480 dimensions, then the Y plane will have 640 x 480 bytes and the U and V planes will both have 640/2 x 480/2. It does not mean that the U plane has 640/2 x 480/2 and the V plane has 640/2 x 480/2; both have only 640/2 x 480/2 bytes. So the total number of bytes in our buffer array will be: 2 is multiplied with (640/2) * (480/2)
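
To make the arithmetic concrete, here is a short C++ sketch that works the plane sizes through for a 640 x 480 NV12 frame (a sanity check only; the sizes follow directly from 4:2:0 subsampling, where the Y plane is full resolution and the interleaved UV plane holds one U and one V byte per 2x2 block of pixels):

#include <cstdio>

int main() {
  const int width = 640, height = 480;

  const int ySize  = width * height;                  // full-resolution Y plane: 307200 bytes
  const int uvSize = 2 * (width / 2) * (height / 2);  // interleaved UV plane, U and V each (w/2)*(h/2): 153600 bytes
  const int total  = ySize + uvSize;                  // width * height * 3 / 2 = 460800 bytes

  std::printf("Y=%d UV=%d total=%d\n", ySize, uvSize, total);
  return 0;
}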

How to convert Android camera preview data in NV21 format to I420 with libyuv?

喜欢而已 submitted on 2019-12-12 17:16:04
Question: I receive the preview data in NV21 format via: public void onPreviewFrame(byte[] data, Camera camera) I want to convert the data to I420 format with libyuv. It seems NV21ToI420 or ConvertToI420 in include/libyuv/convert.h is what I need. // Convert NV21 to I420. LIBYUV_API int NV21ToI420(const uint8* src_y, int src_stride_y, const uint8* src_vu, int src_stride_vu, uint8* dst_y, int dst_stride_y, uint8* dst_u, int dst_stride_u, uint8* dst_v, int dst_stride_v, int width, int height); // Convert camera
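
A minimal C++ sketch of that NV21ToI420 call, assuming the byte[] from onPreviewFrame has been handed to native code (for example via JNI) as one contiguous buffer of width * height * 3 / 2 bytes; the helper name ConvertNV21ToI420 is hypothetical:

#include <cstdint>
#include "libyuv.h"

// Hypothetical helper: nv21 points to the preview frame exactly as delivered by
// onPreviewFrame(): a width x height Y plane followed by an interleaved VU plane.
// dst_i420 must hold width * height * 3 / 2 bytes.
bool ConvertNV21ToI420(const uint8_t* nv21, uint8_t* dst_i420, int width, int height) {
  const uint8_t* src_y  = nv21;
  const uint8_t* src_vu = nv21 + width * height;  // VU plane starts right after Y

  uint8_t* dst_y = dst_i420;
  uint8_t* dst_u = dst_y + width * height;
  uint8_t* dst_v = dst_u + (width / 2) * (height / 2);

  return libyuv::NV21ToI420(src_y, width,
                            src_vu, width,
                            dst_y, width,
                            dst_u, width / 2,
                            dst_v, width / 2,
                            width, height) == 0;
}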

Rendering YUV FFmpeg frames in the Android native NDK

こ雲淡風輕ζ submitted on 2019-12-11 20:13:03
Question: Can we render YUV frames from FFmpeg streaming output (AV_PIX_FMT_YUV420P) directly on the Android screen without converting to RGB format? Answer 1: I've had some experience with the Google WebRTC open source project recently. It provides a fully packaged video call example and also contains an Android demo. What the demo does is display decoded video frames, which are in the I420 (YUV420P) pixel format. Take a look at the source code: https://code.google.com/p/webrtc/source/browse/trunk/webrtc/modules/video
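
The usual way to avoid a CPU-side RGB conversion (and what GLES-based renderers such as the WebRTC demo rely on) is to upload the three I420 planes as separate single-channel textures and convert to RGB per pixel in a fragment shader. A minimal sketch of such a shader, embedded as a C++ string; the BT.601 full-range constants used here are an assumption and may need adjusting for your source:

// Fragment shader for rendering I420 directly: the Y, U and V planes are bound
// as three single-channel textures (yTex, uTex, vTex) and converted to RGB per pixel.
static const char kI420FragmentShader[] = R"(
precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D yTex;
uniform sampler2D uTex;
uniform sampler2D vTex;
void main() {
  float y = texture2D(yTex, vTexCoord).r;
  float u = texture2D(uTex, vTexCoord).r - 0.5;
  float v = texture2D(vTex, vTexCoord).r - 0.5;
  gl_FragColor = vec4(y + 1.402 * v,
                      y - 0.344 * u - 0.714 * v,
                      y + 1.772 * u,
                      1.0);
}
)";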

Problems when scaling a YUV image using the libyuv library

拈花ヽ惹草 submitted on 2019-12-01 02:06:34
Question: I'm developing a camera app based on the Camera2 API and I have found several problems using libyuv. I want to convert YUV_420_888 images retrieved from an ImageReader, but I'm having some problems with scaling in a reprocessable surface. In essence: images come out with tones of green instead of the corresponding tones (I'm exporting the .yuv files and checking them using http://rawpixels.net/). You can see an input example here: And what I get after I perform scaling: I think I am doing something wrong with the strides, or providing an invalid YUV format (maybe I have to transform the
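
Output dominated by green usually points at the chroma planes being read from the wrong offsets or with the wrong strides. A minimal C++ sketch of a stride-aware scale with libyuv::I420Scale, assuming the source has already been repacked as contiguous I420 with tight strides (note that YUV_420_888 chroma planes can have a pixel stride of 2, in which case they need to be de-interleaved first, for example with libyuv::Android420ToI420); the helper name ScaleI420 is hypothetical:

#include <cstdint>
#include "libyuv.h"

// Hypothetical helper: scale a contiguous I420 buffer to a new size.
// Each plane keeps its own stride: width for Y, width / 2 for U and V.
bool ScaleI420(const uint8_t* src, int src_w, int src_h,
               uint8_t* dst, int dst_w, int dst_h) {
  const uint8_t* src_y = src;
  const uint8_t* src_u = src_y + src_w * src_h;
  const uint8_t* src_v = src_u + (src_w / 2) * (src_h / 2);

  uint8_t* dst_y = dst;
  uint8_t* dst_u = dst_y + dst_w * dst_h;
  uint8_t* dst_v = dst_u + (dst_w / 2) * (dst_h / 2);

  return libyuv::I420Scale(src_y, src_w,
                           src_u, src_w / 2,
                           src_v, src_w / 2,
                           src_w, src_h,
                           dst_y, dst_w,
                           dst_u, dst_w / 2,
                           dst_v, dst_w / 2,
                           dst_w, dst_h,
                           libyuv::kFilterBilinear) == 0;
}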
