yuv

Nexus 9 Camera2 API - YUV_420_888 vs. getOutputSizes()

Submitted by 本秂侑毒 on 2019-12-12 10:26:15
Question: I'm implementing the Camera2 API with the YUV_420_888 format on a Nexus 9. I checked the output sizes and wanted to use the largest (8 MP, 3280 x 2460) size to save. However, the output just appears as static lines, similar to how old TVs looked without a signal. I would like to stick with YUV_420_888, since my end goal is to save grayscale data (the Y component). I originally thought it was a camera bandwidth issue, but the same thing happened at some of the smaller sizes (320 x 240). None of the problems
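A common cause of exactly this "static lines" artifact is copying the Y plane's buffer wholesale while ignoring its row stride, which on many devices is larger than the image width. The Camera2 plane API itself is Java, but the index math can be sketched in Python (a hypothetical helper, not Camera2 code):

```python
def extract_y_plane(y_buffer, width, height, row_stride):
    """Copy the luma plane row by row, skipping any stride padding.

    In YUV_420_888 the Y plane's rowStride may exceed the image width;
    copying the padding bytes along with the pixels shifts every row,
    which renders as diagonal 'TV static' lines.
    """
    out = bytearray(width * height)
    for row in range(height):
        src = row * row_stride
        out[row * width:(row + 1) * width] = y_buffer[src:src + width]
    return bytes(out)
```

In Java the same loop would read `plane.getRowStride()` and copy from `plane.getBuffer()` into a tightly packed array before saving.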

Converting RGB to YUV, + ffmpeg

Submitted by 心不动则不痛 on 2019-12-12 02:48:13
Question: I am trying the following to record a live video from my Flash/AIR application: I take a "screenshot" (BitmapData from the stage) each frame. I convert each pixel to YUV format like this (V2): var file :File = new File(_appUrl + "/creation/output.raw"); var fs :FileStream = new FileStream(); fs.open(file, FileMode.WRITE); var finalY :ByteArray = new ByteArray(); var finalU :ByteArray = new ByteArray(); var finalV :ByteArray = new ByteArray(); var rect :Rectangle = new Rectangle(0, 0, 600, 700);
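The per-pixel RGB-to-YUV step the question is building toward can be written with the BT.601 equations. A minimal Python sketch (one common choice of coefficients; the exact matrix depends on what the ffmpeg pipeline expects):

```python
def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (0-255) to YUV using BT.601-style weights.

    U and V are centered on 128 so they fit in an unsigned byte, which is
    what raw planar formats such as yuv420p expect.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.14713 * r - 0.28886 * g + 0.436 * b + 128
    v = 0.615 * r - 0.51499 * g - 0.10001 * b + 128
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(y), clamp(u), clamp(v)
```

Writing the Y values of all pixels, then the U values, then the V values produces a planar 4:4:4 frame; for yuv420p the U and V planes must additionally be averaged down to half resolution in each dimension.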

How to convert YUV420SP to RGB and display it?

Submitted by 邮差的信 on 2019-12-11 19:28:23
Question: I'm trying to render a video frame using the Android NDK. I'm using Google's Native-Codec NDK sample code, modified so I can manually display each video frame (non-tunneled). So I added this code to get the output buffer, which is in YUV: ANativeWindow_setBuffersGeometry(mWindow, bufferWidth, bufferHeight, WINDOW_FORMAT_RGBA_8888); uint8_t *decodedBuff = AMediaCodec_getOutputBuffer(d->codec, status, &bufSize); auto format = AMediaCodec_getOutputFormat(d->codec); LOGV("VOUT: format
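Since the window is configured as RGBA_8888 but the decoder output is YUV420SP, each pixel has to be converted. The per-sample conversion is usually the video-range BT.601 formula; a Python sketch of the math (in the NDK this would be a C loop, or better, a libyuv call):

```python
def yuv_to_rgb(y, u, v):
    """Video-range BT.601 YUV -> RGB for one pixel.

    Y is offset by 16 and U/V by 128; clamping is essential because the
    intermediate values overshoot the 0-255 range near saturated colors.
    """
    c = y - 16
    d = u - 128
    e = v - 128
    clamp = lambda x: max(0, min(255, int(round(x))))
    r = clamp(1.164 * c + 1.596 * e)
    g = clamp(1.164 * c - 0.392 * d - 0.813 * e)
    b = clamp(1.164 * c + 2.017 * d)
    return r, g, b
```

For YUV420SP the U and V bytes for pixel (x, y) come interleaved from the second plane at chroma index ((y/2) * width + (x & ~1)).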

NV21 to I420 in android

Submitted by 我与影子孤独终老i on 2019-12-11 13:36:58
Question: EDIT: I came across libyuv, which does the NV21 to I420 conversion, but I don't really understand how to call it. // Convert NV21 to I420. Same as NV12 but u and v pointers swapped. LIBYUV_API int NV21ToI420(const uint8* src_y, int src_stride_y, const uint8* src_vu, int src_stride_vu, uint8* dst_y, int dst_stride_y, uint8* dst_u, int dst_stride_u, uint8* dst_v, int dst_stride_v, int width, int height) I am passing the NV21 byte[] obtained from the camera callback to the JNI layer and
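What NV21ToI420 does for an unpadded buffer is simple enough to show directly: keep the Y plane and de-interleave the VU plane into separate U and V planes. A pure-Python equivalent (for understanding the call, not a replacement for libyuv's optimized version):

```python
def nv21_to_i420(nv21, width, height):
    """Rearrange an unpadded NV21 buffer into I420 plane order.

    NV21 = full Y plane followed by interleaved V,U pairs at quarter
    resolution; I420 = Y plane, then the whole U plane, then the whole
    V plane.
    """
    ysize = width * height
    y = nv21[:ysize]
    vu = nv21[ysize:]
    v = vu[0::2]  # NV21 stores V first in each pair
    u = vu[1::2]
    return y + u + v  # I420 order: Y, U, V
```

In the libyuv call, src_y points at the start of the byte[], src_vu at offset width*height, the strides are width (Y/VU) and width/2 (dst U/V), and the three dst pointers are slices of a separately allocated I420 buffer.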

About converting YUV(YV12) to RGB with GLSL for iOS

Submitted by 落爺英雄遲暮 on 2019-12-11 13:32:00
Question: I'm trying to convert YUV (YV12) to RGB with a GLSL shader, in these steps: read the raw YUV (YV12) data from an image file; filter Y, Cb and Cr out of the raw data; map the textures; send them to the fragment shader. But the resulting image is not the same as the raw data. The image below is the raw data. screenshot of raw image link (available for download) And the image below is the converted data. screenshot of converted image link (available for download) And below is my source code. - (void) readYUVFile { ... NSData* fileData = [NSData
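A frequent cause of "almost right but wrong colors" with YV12 is slicing the planes at the wrong offsets, because YV12 stores V before U (the reverse of I420). A small Python sketch of the offset math used when filtering the three planes out of the raw file:

```python
def yv12_plane_offsets(width, height):
    """Byte offsets of the Y, V and U planes in a tightly packed YV12 buffer.

    YV12 puts the V (Cr) plane before the U (Cb) plane; swapping the two
    when uploading textures is a classic cause of plausible-looking but
    wrong colors.
    """
    ysize = width * height
    csize = (width // 2) * (height // 2)
    return 0, ysize, ysize + csize  # (y_offset, v_offset, u_offset)
```

When binding the two chroma textures for the shader, each should be width/2 x height/2 single-channel textures taken from these offsets.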

Y'UV420p (and Y'V12 or YV12) to RGB888 conversion

Submitted by 醉酒当歌 on 2019-12-11 11:12:15
Question: I'm trying to show a YUV video file in Android, and I have a few YUV video files that I'm using. This YUV file, video1 (160*120 resolution), is one that I captured from my server as raw H.264 data and converted to a YUV file using OpenH264. I used YUV Player Deluxe to play the above YUV video files and it plays perfectly well. When I try to play the same files in Android, the color component is not reproduced properly. The image appears almost black and white, with a few traces of color in between
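A near-grayscale result with traces of color usually means the luma is indexed correctly but the subsampled chroma is not. In Y'UV420p each U/V sample covers a 2x2 block of luma pixels, so the chroma index must use half-resolution coordinates. A Python sketch of the lookup for one pixel:

```python
def yuv420p_sample(frame, width, height, x, y):
    """Return the (Y, U, V) bytes for pixel (x, y) of a planar I420 frame.

    The chroma planes are width/2 x height/2, so both coordinates are
    halved before indexing; using full-resolution indices here reads
    garbage chroma and washes the colors out.
    """
    ysize = width * height
    cw = width // 2
    csize = cw * (height // 2)
    luma = frame[y * width + x]
    ci = (y // 2) * cw + (x // 2)
    u = frame[ysize + ci]
    v = frame[ysize + csize + ci]
    return luma, u, v
```

Once the three samples are fetched, the usual BT.601 matrix turns them into RGB888.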

Channel and Bit values of a YUV2 data

Submitted by 人走茶凉 on 2019-12-11 11:00:00
Question: I have a data stream coming from a camera (PAL). The data I get from the callback function is in a format like U0-Y0-V0-Y1 U2-Y2-V2-Y3 U4-Y4-V4-Y5 ...... I need to convert the color format to RGB (or BGR) using OpenCV's cvCvtColor() function. The usage of the function is cvCvtColor(YCrCb, dst, CV_YCrCb2BGR); Now here (actually before) comes the problem: dst is a 3-channel 8U image, that's OK, but how can I store the data coming from the callback function in an IplImage directly? If I can store it
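The U0-Y0-V0-Y1 layout described is a packed 4:2:2 stream (UYVY ordering) in which each pair of horizontal pixels shares one U and one V sample. Before handing the data to a 3-channel conversion, it has to be unpacked to one (Y, U, V) triple per pixel. A Python sketch of the unpacking:

```python
def unpack_uyvy(uyvy):
    """Expand a packed U0-Y0-V0-Y1 (UYVY) byte stream into per-pixel
    (Y, U, V) triples; the two pixels of each 4-byte group share the
    same U and V sample.
    """
    pixels = []
    for i in range(0, len(uyvy), 4):
        u, y0, v, y1 = uyvy[i:i + 4]
        pixels.append((y0, u, v))
        pixels.append((y1, u, v))
    return pixels
```

In OpenCV terms, the duplicated-chroma triples fill a 3-channel 8U image that cvCvtColor can then convert to BGR (note that OpenCV also has dedicated packed-4:2:2 conversion codes, which avoid the manual unpack).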

Android org.webrtc.VideoRenderer.I420Frame arrays from an image

Submitted by 岁酱吖の on 2019-12-11 08:24:21
Question: I keep hoping some code will appear on the internet, but I'm getting nowhere ;) I am running this GitHub example. The incoming WebRTC I420Frame object seems to have 3 arrays of yuvPlanes. A typical Android camera app gets the PreviewCallback.onPreviewFrame byte[] as a single array of bytes. My job is to stream an image as I420 at a regular time interval. Can someone help me generate an I420Frame's yuvPlanes from a single byte[] array, like a JPEG/PNG file? It is pretty critical.
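Once the image bytes are in I420 layout (a decoded JPEG/PNG would first need an RGB-to-I420 conversion, and a camera byte[] is typically NV21 and needs de-interleaving), filling the three yuvPlanes arrays is just slicing at the right offsets. A Python sketch of that final step, assuming an unpadded buffer:

```python
def split_i420(buf, width, height):
    """Slice a single I420 byte buffer into the three planes an
    I420Frame-style object expects (assumes no row padding).
    """
    ysize = width * height
    csize = (width // 2) * (height // 2)
    assert len(buf) == ysize + 2 * csize, "not an unpadded I420 frame"
    return buf[:ysize], buf[ysize:ysize + csize], buf[ysize + csize:]
```

The corresponding yuvStrides for an unpadded frame are width for the Y plane and width/2 for each chroma plane.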

GPU YUV to RGB. Worth the effort?

Submitted by 江枫思渺然 on 2019-12-11 03:47:47
Question: I have to convert several full PAL videos (720x576@25) from YUV 4:2:2 to RGB in real time, and probably apply a custom resize to each. I have thought of using the GPU, as I have seen an example that does just this (except that it's 4:4:4, so the bpp is the same in source and destination) -- http://www.fourcc.org/source/YUV420P-OpenGL-GLSLang.c However, I don't have any experience with using GPUs and I'm not sure what can be done. The example, as I understand it, just converts the video frame to

DirectShow RGB-YUV filter

Submitted by 帅比萌擦擦* on 2019-12-11 02:53:11
Question: I would like to encode video in my app with VP8. I use the RGB24 format in my app, but the VP8 DirectShow filter accepts only YUV format (http://www.webmproject.org/tools/#directshow_filters). I've googled "RGB to YUV directshow filter" with no success. I don't want to write this filter myself from scratch, so I would appreciate your help with information on where to find such a filter. Thanks! Answer 1: You could try Geraint Davies' YUV transform filter to see if it supports the conversion. Answer 2: