Android org.webrtc.VideoRenderer.I420Frame arrays from an image

Submitted by 岁酱吖の on 2019-12-11 08:24:21

Question


I keep hoping some code will appear on the internet, but I'm getting nowhere ;) I am running this GitHub example. The incoming WebRTC I420Frame object seems to have three yuvPlanes arrays.

A typical Android camera app gets the PreviewCallback.onPreviewFrame byte[] as a single array of bytes. My job is to stream an image as I420 at regular intervals of time. Can someone help me generate an I420Frame's yuvPlanes from a single byte[] array, such as a JPEG/PNG file?

This is pretty critical. All answers appreciated.


Answer 1:


PreviewCallback.onPreviewFrame() will never return JPEG or PNG stream. You should check your camera getSupportedPreviewFormats() list (note that this may differ for front and rear cameras). You are guaranteed to have NV21 in this list. If you are lucky, you can choose YV12 since API level 12 (note that some devices, e.g. Amazon Fire HD (2012), lie about this and actually cannot deliver YV12 stream).
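The format check described above can be sketched as a small helper. The class and method names here are illustrative, not part of any Android API; the constants mirror `android.graphics.ImageFormat` so the sketch compiles off-device:

```java
import java.util.List;

// Illustrative helper: prefer YV12 when the camera reports it, otherwise
// fall back to NV21, which getSupportedPreviewFormats() must always include.
public class PreviewFormatChooser {
    public static final int NV21 = 0x11;       // value of ImageFormat.NV21
    public static final int YV12 = 0x32315659; // value of ImageFormat.YV12 ('YV12' FourCC)

    public static int choose(List<Integer> supportedFormats) {
        return supportedFormats.contains(YV12) ? YV12 : NV21;
    }
}
```

On a device you would feed it `camera.getParameters().getSupportedPreviewFormats()` and pass the result to `Parameters.setPreviewFormat()` before starting the preview.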

It's easy to build an I420Frame from a YV12 byte array:

private VideoRenderer.I420Frame mFrame;
void onPreviewFrame(byte[] yv12_data, Camera camera) {
    if (mFrame == null) {
        Camera.Parameters params = camera.getParameters(); // this is an expensive call, don't repeat it on every frame!
        assert(params.getPreviewFormat() == ImageFormat.YV12);
        int width = params.getPreviewSize().width;
        int stride_y = 16 + ((width-1)/16)*16;
        int stride_uv = 16 + ((stride_y/2-1)/16)*16;
        int height = params.getPreviewSize().height; 
        mFrame = new VideoRenderer.I420Frame(width, height, 0, new int[]{stride_y, stride_uv, stride_uv}, new ByteBuffer[3], 0);
    }

    mFrame.yuvPlanes[0] = ByteBuffer.wrap(yv12_data, 0, mFrame.yuvStrides[0]*mFrame.height); // Y
    mFrame.yuvPlanes[1] = ByteBuffer.wrap(yv12_data, mFrame.yuvStrides[0]*mFrame.height + mFrame.yuvStrides[2]*mFrame.height/2, mFrame.yuvStrides[1]*mFrame.height/2); // U (follows the V plane in YV12)
    mFrame.yuvPlanes[2] = ByteBuffer.wrap(yv12_data, mFrame.yuvStrides[0]*mFrame.height, mFrame.yuvStrides[2]*mFrame.height/2); // V

    // ... do something with the frame
}
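The stride arithmetic above can be factored into a standalone, testable helper. This is a hypothetical class (not part of WebRTC or Android) that applies the same rule the snippet uses, matching the YV12 layout documented for `ImageFormat.YV12`: yStride = ALIGN(width, 16), uvStride = ALIGN(yStride / 2, 16), with the V plane stored before the U plane:

```java
// Hypothetical helper: plane strides and byte offsets for a YV12 preview buffer.
public class Yv12Layout {
    public final int strideY;
    public final int strideUv;
    public final int offsetV;   // V plane immediately follows the Y plane
    public final int offsetU;   // U plane follows the V plane
    public final int totalSize; // total buffer size in bytes

    public Yv12Layout(int width, int height) {
        strideY = align16(width);
        strideUv = align16(strideY / 2);
        offsetV = strideY * height;
        offsetU = offsetV + strideUv * height / 2; // chroma planes have height/2 rows
        totalSize = offsetU + strideUv * height / 2;
    }

    private static int align16(int x) {
        return ((x + 15) / 16) * 16; // round up to a multiple of 16
    }
}
```

For a 640x480 preview the strides are 640 and 320 and the total size is 460800 bytes (width * height * 3/2), since 640 is already 16-aligned.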

For NV21, you must allocate the U and V planes:

private VideoRenderer.I420Frame mFrame;
void onPreviewFrame(byte[] nv21_data, Camera camera) {
    if (mFrame == null) {
        Camera.Parameters params = camera.getParameters(); // this is an expensive call, don't repeat it on every frame!
        assert(params.getPreviewFormat() == ImageFormat.NV21);
        int width = params.getPreviewSize().width;
        int height = params.getPreviewSize().height; 
        mFrame = new VideoRenderer.I420Frame(width, height, 0, new int[]{width, width/2, width/2}, new ByteBuffer[3], 0);
        mFrame.yuvPlanes[1] = ByteBuffer.wrap(new byte[width*height/4]);
        mFrame.yuvPlanes[2] = ByteBuffer.wrap(new byte[width*height/4]);
    }

    mFrame.yuvPlanes[0] = ByteBuffer.wrap(nv21_data, 0, mFrame.width*mFrame.height); // Y
    for (int to = 0, from = mFrame.width*mFrame.height; from < mFrame.width*mFrame.height*3/2; to++, from += 2) {
        mFrame.yuvPlanes[1].put(to, nv21_data[from+1]); // U
        mFrame.yuvPlanes[2].put(to, nv21_data[from]);   // V
    }

    // ... do something with the frame
}
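The chroma deinterleave in that loop can be extracted into a self-contained helper, which makes the NV21 byte order easy to verify. The class and method names are illustrative, not WebRTC API; NV21 stores the full Y plane followed by interleaved V,U pairs:

```java
// Illustrative helper: split the interleaved NV21 chroma block into the
// separate U and V planes that I420 expects.
public class Nv21Util {
    /** Returns {uPlane, vPlane} extracted from an NV21 buffer of size width*height*3/2. */
    public static byte[][] splitNv21Chroma(byte[] nv21, int width, int height) {
        int ySize = width * height;
        byte[] u = new byte[ySize / 4];
        byte[] v = new byte[ySize / 4];
        for (int to = 0, from = ySize; from < ySize * 3 / 2; to++, from += 2) {
            v[to] = nv21[from];     // V comes first in each NV21 pair
            u[to] = nv21[from + 1]; // U comes second
        }
        return new byte[][] { u, v };
    }
}
```

Note this copies every chroma byte once per frame; for production use you would reuse the two output arrays rather than allocate them per call, as the answer above does by filling the planes allocated once in the `mFrame == null` branch.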



Answer 2:


 I420Frame onPreviewFrame(byte[] yv12_data)
        {
            if (mFrame == null)
            {
                //Camera.Parameters params = camera.getParameters(); // this is an expensive call, don't repeat it on every frame!
                //assert(params.getPreviewFormat() == ImageFormat.YV12);
                int width = 640;
                int stride_y = 16 + ((width - 1) / 16) * 16;
                int stride_uv = 16 + ((stride_y / 2 - 1) / 16) * 16;
                int height = 480;
                mFrame = new VideoRenderer.I420Frame(width, height,  new int[] { stride_y, stride_uv, stride_uv }, new ByteBuffer[3]);
            }                

            mFrame.YuvPlanes[0] = ByteBuffer.Wrap(yv12_data, 0, mFrame.YuvStrides[0] * mFrame.Height); // Y
            mFrame.YuvPlanes[2] = ByteBuffer.Wrap(yv12_data, mFrame.YuvStrides[0] * mFrame.Height, mFrame.YuvStrides[2] * mFrame.Height / 2); // V (stored before U in YV12)
            mFrame.YuvPlanes[1] = ByteBuffer.Wrap(yv12_data, (mFrame.YuvStrides[0] * mFrame.Height) + (mFrame.YuvStrides[2] * mFrame.Height / 2), mFrame.YuvStrides[1] * mFrame.Height / 2); // U
            return mFrame;
            //  ... do something with the frame
        }


Source: https://stackoverflow.com/questions/41841257/android-org-webrtc-videorenderer-i420frame-arrays-from-an-image
