Android byte[] to image in Camera.onPreviewFrame


This has been hard to find! But since API 8, there is a YuvImage class in android.graphics. It's not an Image descendant, so all you can do with it is save it as JPEG, but you could write it to an in-memory stream and then load it into a Bitmap if that's what you need.

import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.os.Environment;
import android.widget.Toast;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    try {
        Camera.Parameters parameters = camera.getParameters();
        Camera.Size size = parameters.getPreviewSize();
        // Wrap the raw preview bytes in a YuvImage using the camera's preview format.
        YuvImage image = new YuvImage(data, parameters.getPreviewFormat(),
                size.width, size.height, null);
        // Writing to external storage requires the WRITE_EXTERNAL_STORAGE permission.
        File file = new File(Environment.getExternalStorageDirectory()
                .getPath() + "/out.jpg");
        FileOutputStream filecon = new FileOutputStream(file);
        // Compress the full frame to JPEG at quality 90 and write it to the file.
        image.compressToJpeg(
                new Rect(0, 0, image.getWidth(), image.getHeight()), 90,
                filecon);
        filecon.close();
    } catch (IOException e) {
        Toast toast = Toast
                .makeText(getBaseContext(), e.getMessage(), Toast.LENGTH_LONG);
        toast.show();
    }
}
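
If you need a Bitmap in memory rather than a file on disk, a minimal sketch of the memory-stream route mentioned above (compress the YuvImage into a ByteArrayOutputStream, then decode with BitmapFactory) could look like this; the helper name previewToBitmap is only illustrative:

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;

import java.io.ByteArrayOutputStream;

// Illustrative helper: converts one preview frame to a Bitmap via an in-memory JPEG.
public Bitmap previewToBitmap(byte[] data, Camera camera) {
    Camera.Parameters parameters = camera.getParameters();
    Camera.Size size = parameters.getPreviewSize();
    YuvImage yuv = new YuvImage(data, parameters.getPreviewFormat(),
            size.width, size.height, null);

    // Compress the full frame to JPEG into a byte array instead of a file.
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 90, out);
    byte[] jpegBytes = out.toByteArray();

    // Decode the JPEG bytes back into an RGB Bitmap.
    return BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);
}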

Since Android 3.0 you can use a TextureView and SurfaceTexture to display the camera preview, and then call mTextureView.getBitmap() to retrieve a friendly RGB preview frame.

A very skeletal example of how to do this is given in the TextureView docs. Note that you'll have to set your application or activity to be hardware accelerated by putting android:hardwareAccelerated="true" in the manifest.
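
A hedged sketch of that approach, modeled on the SurfaceTextureListener pattern from the TextureView docs (the field names mCamera and mTextureView are assumptions):

import android.graphics.Bitmap;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.view.TextureView;

import java.io.IOException;

// Sketch: show the camera preview in a TextureView, then grab RGB frames with getBitmap().
private Camera mCamera;               // assumed field
private TextureView mTextureView;     // assumed field

private final TextureView.SurfaceTextureListener mListener =
        new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mCamera = Camera.open();
        try {
            mCamera.setPreviewTexture(surface); // route preview frames into the TextureView
            mCamera.startPreview();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        mCamera.stopPreview();
        mCamera.release();
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // Each time a new frame arrives, getBitmap() returns it as an RGB Bitmap.
        Bitmap frame = mTextureView.getBitmap();
        // ... use the frame ...
    }
};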

Vinod Maurya

I found the answer after a long time. Here it is...

Instead of using BitmapFactory, I used a custom method to decode this byte[] data into a valid image. To do that, you need to know which preview format the camera is using, by calling camera.getParameters().getPreviewFormat(). This returns a constant defined in ImageFormat. Once you know the format, use the appropriate converter to encode the image.

In my case the byte[] data was in a YUV format, so I looked for a YUV-to-bitmap (RGB) conversion, and that solved my problem.
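
For reference, a common way to do that conversion by hand (without going through JPEG) is the well-known NV21-to-ARGB integer approximation. This is only a sketch: it assumes the preview format is NV21, and the helper names are mine:

import android.graphics.Bitmap;

// Decode an NV21 (YUV420SP) frame into packed ARGB_8888 pixels.
public static void decodeYUV420SP(int[] argb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & yuv420sp[yp]) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            // Integer approximation of the YUV -> RGB matrix.
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            argb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}

// Illustrative wrapper: turn one NV21 preview frame into a Bitmap.
public Bitmap nv21ToBitmap(byte[] data, int width, int height) {
    int[] argb = new int[width * height];
    decodeYUV420SP(argb, data, width, height);
    return Bitmap.createBitmap(argb, width, height, Bitmap.Config.ARGB_8888);
}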

You can try this. This example sends camera frames to a server:

import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import android.hardware.Camera;
import android.util.Base64;
import android.util.Log;

import java.io.ByteArrayOutputStream;

// Preview callback that encodes each frame as a base64 JPEG and emits it over a socket.
// mSocket is assumed to be an already-connected Socket.IO client.
Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            byte[] jpeg = convertYuvToJpeg(data, camera);
            StringBuilder dataBuilder = new StringBuilder();
            dataBuilder.append("data:image/jpeg;base64,")
                    .append(Base64.encodeToString(jpeg, Base64.DEFAULT));
            mSocket.emit("newFrame", dataBuilder.toString());
        } catch (Exception e) {
            Log.d("########", "ERROR", e);
        }
    }
};

public byte[] convertYuvToJpeg(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();
    YuvImage image = new YuvImage(data, ImageFormat.NV21,
            size.width, size.height, null);

    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int quality = 20; // low quality keeps the JPEG small for streaming
    // This line also decreases the image quality.
    image.compressToJpeg(new Rect(0, 0, size.width, size.height), quality, baos);

    return baos.toByteArray();
}
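
To actually receive frames, the callback above has to be registered on the camera before the preview starts. A minimal hedged sketch (mSurfaceHolder and previewCallback are assumed to exist elsewhere in the activity):

import android.hardware.Camera;
import java.io.IOException;

try {
    Camera mCamera = Camera.open();
    mCamera.setPreviewDisplay(mSurfaceHolder);   // the preview must be drawn somewhere (SurfaceView here)
    mCamera.setPreviewCallback(previewCallback); // onPreviewFrame() will fire for each preview frame
    mCamera.startPreview();
} catch (IOException e) {
    e.printStackTrace();
}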