How do I apply custom filters to individual frames in the camera preview, and show them?
What I've tried so far:
mCamera.setPreviewCallback(new CameraGreenFilter());
OK, there are several ways to do this, but there is a significant problem with performance. The byte[] from the camera is in YUV format, which has to be converted to some sort of RGB format if you want to display it. This conversion is quite an expensive operation and significantly lowers the output fps.
It depends on what you actually want to do with the camera preview, because the best solution is to draw the camera preview without a callback and apply some effects over it. That is the usual way to do augmented reality stuff.
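A minimal sketch of that approach (assuming a SurfaceView called surfaceView whose surface has already been created):

// Normal preview path: the camera draws itself, so no per-frame conversion is needed.
mCamera.setPreviewDisplay(surfaceView.getHolder()); // may throw IOException
mCamera.startPreview();
// Draw the effects in the onDraw() of a transparent View stacked above surfaceView.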
But if you really need to display the output manually, there are several ways to do that. Your example does not work for several reasons. First, you are not displaying the image at all. If you call this:
mCamera.setPreviewCallback(new CameraGreenFilter());
mCamera.setPreviewDisplay(null);
then your camera is not displaying the preview at all; you have to display it manually. And you can't do any expensive operations in the onPreviewFrame method, because the lifetime of data is limited: it's overwritten on the next frame. One hint: use setPreviewCallbackWithBuffer, it's faster because it reuses one buffer and does not have to allocate new memory on each frame.
So you have to do something like this:
private byte[] cameraFrame;
private byte[] buffer;

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    cameraFrame = data;
    camera.addCallbackBuffer(data); // addCallbackBuffer(buffer) also has to be called once somewhere before you call mCamera.startPreview()
}
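The one-time setup mentioned in the comment above could look roughly like this (a sketch; it assumes mCamera is already open and that the registering class implements Camera.PreviewCallback):

Camera.Parameters params = mCamera.getParameters();
Camera.Size size = params.getPreviewSize();
// NV21 uses 12 bits per pixel, so the buffer needs width * height * 3 / 2 bytes.
buffer = new byte[size.width * size.height
        * ImageFormat.getBitsPerPixel(params.getPreviewFormat()) / 8];
mCamera.setPreviewCallbackWithBuffer(this); // 'this' implements Camera.PreviewCallback
mCamera.addCallbackBuffer(buffer);          // hand the buffer to the camera once
mCamera.startPreview();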
private ByteArrayOutputStream baos;
private YuvImage yuvImage;
private byte[] jdata;
private Bitmap bmp;
private Paint paint;

@Override // from SurfaceView
public void onDraw(Canvas canvas) {
    baos = new ByteArrayOutputStream();
    yuvImage = new YuvImage(cameraFrame, ImageFormat.NV21, prevX, prevY, null);
    yuvImage.compressToJpeg(new Rect(0, 0, prevX, prevY), 80, baos); // prevX and prevY are the preview dimensions
    jdata = baos.toByteArray();
    bmp = BitmapFactory.decodeByteArray(jdata, 0, jdata.length);
    canvas.drawBitmap(bmp, 0, 0, paint);
    invalidate(); // to call onDraw again
}
To make this work, you need to call setWillNotDraw(false) in the class constructor or somewhere.
In onDraw you can, for example, apply paint.setColorFilter(filter) if you want to modify the colors.
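For example, a filter that keeps only the green channel (in the spirit of the CameraGreenFilter from the question) could be set up once, e.g. in the constructor, with a ColorMatrix; this is just a sketch of one possible filter:

// Zero out the red and blue rows of the 4x5 color matrix, keep green and alpha.
ColorMatrix greenOnly = new ColorMatrix(new float[] {
        0, 0, 0, 0, 0,   // red
        0, 1, 0, 0, 0,   // green
        0, 0, 0, 0, 0,   // blue
        0, 0, 0, 1, 0    // alpha
});
paint = new Paint();
paint.setColorFilter(new ColorMatrixColorFilter(greenOnly));

Setting this up once (rather than in onDraw) avoids allocating new objects on every frame; the filter is then applied while the bitmap is drawn.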
So this will work, but the performance will be low (less than 8 fps), because BitmapFactory.decodeByteArray is slow. You can try to convert the data from YUV to RGB with native code and the Android NDK, but that's quite complicated.
The other option is to use OpenGL ES. You need a GLSurfaceView where you bind the camera frame as a texture (in the GLSurfaceView implement Camera.PreviewCallback, so you use onPreviewFrame the same way as with a regular surface). But there is the same problem: you need to convert the YUV data. There is one shortcut, though: you can display only the luminance data from the preview (a greyscale image) quite fast, because the first prevX * prevY bytes of the NV21 array are the luminance (Y) plane, without any color information. So in onPreviewFrame you use arraycopy to copy that part of the array, and then you bind the texture like this:
gl.glGenTextures(1, cameraTexture, 0);
int tex = cameraTexture[0];
gl.glBindTexture(GL10.GL_TEXTURE_2D, tex);
gl.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_LUMINANCE,
        this.prevX, this.prevY, 0, GL10.GL_LUMINANCE,
        GL10.GL_UNSIGNED_BYTE, ByteBuffer.wrap(this.cameraFrame)); // cameraFrame holds only the Y plane copied out of the byte[] from onPreviewFrame
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
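The onPreviewFrame side of this (copying only the Y plane before it gets bound as a texture) might look like the sketch below; requestRender() assumes the GLSurfaceView is set to RENDERMODE_WHEN_DIRTY:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    if (cameraFrame == null) {
        cameraFrame = new byte[prevX * prevY]; // Y plane only: one byte per pixel
    }
    // The luminance (Y) plane is the first prevX * prevY bytes of the NV21 buffer.
    System.arraycopy(data, 0, cameraFrame, 0, prevX * prevY);
    camera.addCallbackBuffer(data); // give the buffer back for the next frame
    requestRender();                // redraw the GLSurfaceView with the new frame
}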
You can get about 16-18 fps this way, and you can use OpenGL to apply some filters. I can send you some more code for this if you want, but it's too long to post here...
For some more info, you can see my similar question, but there is no good solution there either...