Question
I am currently working with JavaCV, which makes use of the public void onPreviewFrame(byte[] data, Camera camera) callback of the Camera class.
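For context, the old pipeline looks roughly like this (a minimal sketch of the deprecated API, not my actual setup); by default the preview callback delivers NV21 frames:

// Legacy android.hardware.Camera pipeline (sketch for reference only).
// The default preview format is ImageFormat.NV21, which is what JavaCV consumes here.
Camera camera = Camera.open();
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // 'data' is an NV21 (YUV_420_SP) byte array of size width * height * 3 / 2
    }
});
// A preview surface/texture still has to be set before startPreview() in a real app.
camera.startPreview();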
Since Camera is deprecated, I have been looking into camera2 and MediaProjection. Both of these APIs make use of the ImageReader class.
Currently I instantiate such an ImageReader with the following code:
ImageReader.newInstance(DISPLAY_WIDTH, DISPLAY_HEIGHT, PixelFormat.RGBA_8888, 2);
And attach an OnImageAvailableListener like this:
private final ImageReader.OnImageAvailableListener mOnImageAvailableListener =
        new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        mBackgroundHandler.post(new processImage(reader.acquireNextImage()));
    }
};
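(For reference, the listener is registered with setOnImageAvailableListener, and processImage is just a Runnable that wraps the acquired Image; a rough sketch of its shape, with the actual conversion left out because that is what this question is about:)

// Registered once after the ImageReader is created:
// mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

private class processImage implements Runnable {
    private final Image mImage;

    processImage(Image image) {
        mImage = image;
    }

    @Override
    public void run() {
        try {
            // ... convert mImage to an NV21 byte[] and hand it to JavaCV ...
        } finally {
            // Images obtained from acquireNextImage() must always be closed,
            // otherwise the ImageReader runs out of buffers.
            mImage.close();
        }
    }
}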
I have tried using the RGBA_8888 format with JavaCV as per this thread: https://github.com/bytedeco/javacv/issues/298 but that doesn't work for me.
So instead I was thinking about using RenderScript to convert these Images to the NV21 (YUV_420_SP) format (which is the default output of the Camera API in the onPreviewFrame function), as that worked for me with the old Camera library.
I have also read posts such as this one and this website on how to do the conversion, but they didn't work for me and I fear they will be too slow. Furthermore, my knowledge of C is severely limited. Basically, it looks like I want the reverse operation of https://developer.android.com/reference/android/renderscript/ScriptIntrinsicYuvToRGB.html
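For reference, the forward direction with that intrinsic looks roughly like this (sketch only; it assumes an NV21 byte array yuvBytes of size width * height * 3 / 2, and as far as I can tell there is no built-in intrinsic for the reverse, RGB-to-YUV, direction):

// NV21 -> RGBA Bitmap via ScriptIntrinsicYuvToRGB (the direction I want to reverse).
RenderScript rs = RenderScript.create(context);
ScriptIntrinsicYuvToRGB yuvToRgb = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));

// Input allocation holds the raw NV21 bytes.
Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(yuvBytes.length);
Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT);

// Output allocation holds RGBA pixels.
Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(width).setY(height);
Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);

in.copyFrom(yuvBytes);
yuvToRgb.setInput(in);
yuvToRgb.forEach(out);

Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
out.copyTo(bitmap);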
So how can you go from an Image to a byte array that matches the output of the onPreviewFrame function, i.e. the NV21 (YUV_420_SP) format? Preferably using RenderScript, as it's faster.
Edit 1:
I have tried using ImageFormat.YUV_420_888, but to no avail. I kept getting errors like "The producer output buffer format 0x1 doesn't match the ImageReader's configured buffer format". I switched back to PixelFormat.RGBA_8888 and discovered that there is only one plane in the Image object. The byte buffer of this plane is of size width * height * 4 (one byte each for R, G, B, and A). So I tried to convert this to the NV21 format.
I modified code from this answer to produce the following function:
void RGBtoNV21(byte[] yuv420sp, byte[] argb, int width, int height) {
    final int frameSize = width * height;
    int yIndex = 0;
    int uvIndex = frameSize;
    int A, R, G, B, Y, U, V;
    int index = 0;
    int rgbIndex = 0;

    for (int i = 0; i < height; i++) {
        for (int j = 0; j < width; j++) {
            R = argb[rgbIndex++];
            G = argb[rgbIndex++];
            B = argb[rgbIndex++];
            A = argb[rgbIndex++];

            // RGB to YUV conversion according to
            // https://en.wikipedia.org/wiki/YUV#Y.E2.80.B2UV444_to_RGB888_conversion
            Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
            U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
            V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

            // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor
            // of 2, meaning for every 4 Y pixels there are 1 V and 1 U.
            // Note the sampling is every other pixel AND every other scanline.
            yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
            if (i % 2 == 0 && index % 2 == 0) {
                yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
            }
            index++;
        }
    }
}
and invoke it using:
int mWidth = mImage.getWidth();
int mHeight = mImage.getHeight();
byte[] rgbaBytes = new byte[mWidth * mHeight * 4];
mImage.getPlanes()[0].getBuffer().get(rgbaBytes);
mImage.close();
byte[] yuv = new byte[mWidth * mHeight * 3 / 2];
RGBtoNV21(yuv, rgbaBytes, mWidth, mHeight);
Here mImage is an Image object produced by my ImageReader.
Yet this produces a clearly malformed image. It seems like my conversion is off, but I cannot figure out what exactly.
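One thing I have not ruled out: the RGBA_8888 plane has a rowStride that can be larger than width * 4, in which case the plain bulk get() above would shift every row. A sketch of reading the plane row by row (same variable names as above) would be:

// Sketch: copy the single RGBA_8888 plane row by row, respecting rowStride.
// Padding (rowStride > width * 4) would otherwise misalign every row.
Image.Plane plane = mImage.getPlanes()[0];
ByteBuffer buffer = plane.getBuffer();
int rowStride = plane.getRowStride();
byte[] rgbaBytes = new byte[mWidth * mHeight * 4];
for (int row = 0; row < mHeight; row++) {
    buffer.position(row * rowStride);
    buffer.get(rgbaBytes, row * mWidth * 4, mWidth * 4);
}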
Answer 1:
@TargetApi(19)
public static byte[] yuvImageToByteArray(Image image) {
    assert (image.getFormat() == ImageFormat.YUV_420_888);

    int width = image.getWidth();
    int height = image.getHeight();
    Image.Plane[] planes = image.getPlanes();
    byte[] result = new byte[width * height * 3 / 2];

    // Copy the Y plane, honoring its row stride.
    int stride = planes[0].getRowStride();
    assert (1 == planes[0].getPixelStride());
    if (stride == width) {
        planes[0].getBuffer().get(result, 0, width * height);
    } else {
        for (int row = 0; row < height; row++) {
            planes[0].getBuffer().position(row * stride);
            planes[0].getBuffer().get(result, row * width, width);
        }
    }

    // Interleave the Cr/Cb planes into the NV21 VU layout.
    stride = planes[1].getRowStride();
    assert (stride == planes[2].getRowStride());
    int pixelStride = planes[1].getPixelStride();
    assert (pixelStride == planes[2].getPixelStride());
    byte[] rowBytesCb = new byte[stride];
    byte[] rowBytesCr = new byte[stride];

    for (int row = 0; row < height / 2; row++) {
        // Each interleaved VU row of NV21 holds 'width' bytes.
        int rowOffset = width * height + width * row;
        // The last chroma row may be shorter than the stride, so only read what remains.
        planes[1].getBuffer().position(row * stride);
        planes[1].getBuffer().get(rowBytesCb, 0, Math.min(stride, planes[1].getBuffer().remaining()));
        planes[2].getBuffer().position(row * stride);
        planes[2].getBuffer().get(rowBytesCr, 0, Math.min(stride, planes[2].getBuffer().remaining()));
        for (int col = 0; col < width / 2; col++) {
            result[rowOffset + col * 2] = rowBytesCr[col * pixelStride];     // V
            result[rowOffset + col * 2 + 1] = rowBytesCb[col * pixelStride]; // U
        }
    }
    return result;
}
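A rough usage sketch (this assumes the frames come from a camera2 capture session; a MediaProjection virtual display generally keeps producing RGBA_8888, which is consistent with the 0x1 error above): configure the ImageReader for ImageFormat.YUV_420_888, call the function from the callback, and hand the resulting NV21 bytes to JavaCV just like the old onPreviewFrame() data.

// Sketch: an ImageReader configured for YUV_420_888, e.g. added as a camera2 output target.
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage();
        if (image == null) {
            return;
        }
        try {
            byte[] nv21 = yuvImageToByteArray(image);
            // hand nv21 to JavaCV, just like the data array of the old onPreviewFrame()
        } finally {
            image.close();
        }
    }
}, backgroundHandler);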
I have published another function with similar requirements. That new implementation tries to take advantage of the fact that quite often, YUV_420_888 is only NV21 in disguise.
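The basic idea of that fast path, as a sketch (not the published implementation; variable names follow the function above): when the chroma pixel stride is 2, the V plane's buffer usually already contains the interleaved V,U,V,U,... bytes and can be bulk-copied behind the Y plane.

// Sketch of the "NV21 in disguise" fast path.
// A more robust check would also verify that the U and V buffers really share memory.
Image.Plane vPlane = image.getPlanes()[2];
if (vPlane.getPixelStride() == 2 && vPlane.getRowStride() == width) {
    ByteBuffer vuBuffer = vPlane.getBuffer();
    vuBuffer.position(0);
    // The V buffer is typically one byte short of width * height / 2
    // (the trailing U sample belongs to the U plane), so copy what is there.
    int vuLength = Math.min(vuBuffer.remaining(), width * height / 2);
    vuBuffer.get(result, width * height, vuLength);
}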
Source: https://stackoverflow.com/questions/39067828/android-convert-imagereader-image-to-ycbcr-420-sp-nv21-byte-array-using-render