Question
Hey, I'm creating a small camera app. I have implemented everything, but I have one problem: converting the NV21 byte array into JPEG format.
I have found many ways, but all of them either don't work at all or only work on some devices.
First I tried this snippet; it works on an Xperia Z2 (5.2) but fails on a Galaxy S4 (4.4.4):
bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
This approach also works on the same device and fails on the other:
int pich = camera.getParameters().getPreviewSize().height;
int picw = camera.getParameters().getPreviewSize().width;
int[] pix = new int[picw * pich];
bitmap.getPixels(pix, 0, picw, 0, 0, picw, pich);
// int R, G, B, Y;
for (int y = 0; y < pich; y++) {
    for (int x = 0; x < picw; x++) {
        int index = y * picw + x;
        int R = (pix[index] >> 16) & 0xff;
        int G = (pix[index] >> 8) & 0xff;
        int B = pix[index] & 0xff;
        pix[index] = 0xff000000 | (R << 16) | (G << 8) | B;
    }
}
Secondly, I tried many solutions to decode NV21.
The first one uses RenderScript:
public Bitmap convertYUV420_NV21toRGB8888_RenderScript(byte[] data, int W, int H, Fragment fragment) {
    // http://stackoverflow.com/questions/20358803/how-to-use-scriptintrinsicyuvtorgb-converting-byte-yuv-to-byte-rgba
    RenderScript rs;
    ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic;
    rs = RenderScript.create(fragment.getActivity());
    yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs)); // create an intrinsic for converting YUV to RGB
    Type.Builder yuvType = new Type.Builder(rs, Element.U8(rs)).setX(data.length);
    Allocation in = Allocation.createTyped(rs, yuvType.create(), Allocation.USAGE_SCRIPT); // an Allocation is populated with empty data when it is first created
    Type.Builder rgbaType = new Type.Builder(rs, Element.RGBA_8888(rs)).setX(W).setY(H);
    Allocation out = Allocation.createTyped(rs, rgbaType.create(), Allocation.USAGE_SCRIPT);
    in.copyFrom(data); // populate the input Allocation with the NV21 data
    yuvToRgbIntrinsic.setInput(in); // set the input YUV allocation, must be U8 (RenderScript)
    yuvToRgbIntrinsic.forEach(out); // launch the kernel and convert the image to RGB
    Bitmap bmpout = Bitmap.createBitmap(W, H, Bitmap.Config.ARGB_8888);
    out.copyTo(bmpout); // copy data out of the Allocation into the Bitmap
    return bmpout;
}
And also this code:
void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0)
                r = 0;
            else if (r > 262143)
                r = 262143;
            if (g < 0)
                g = 0;
            else if (g > 262143)
                g = 262143;
            if (b < 0)
                b = 0;
            else if (b > 262143)
                b = 262143;
            rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
    }
}
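For reference, the int[] filled by this helper is usually wrapped into a Bitmap afterwards. A minimal sketch, where the preview-size lookup and variable names are illustrative and not from the original post:
// Illustrative usage of decodeYUV420SP (names assumed, not from the original post)
Camera.Size size = camera.getParameters().getPreviewSize();
int[] rgb = new int[size.width * size.height];
decodeYUV420SP(rgb, data, size.width, size.height); // data is the NV21 preview frame
Bitmap bitmap = Bitmap.createBitmap(rgb, size.width, size.height, Bitmap.Config.ARGB_8888);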
And finally I tried to save the image to the SD card and then reopen it, but it also fails:
File pictureFile = new File(filename);
int pich = camera.getParameters().getPreviewSize().height;
int picw = camera.getParameters().getPreviewSize().width;
Rect rect = new Rect(0, 0, picw, pich);
YuvImage img = new YuvImage(data, ImageFormat.NV21, picw, picw, null);
try {
    FileOutputStream fos = new FileOutputStream(pictureFile);
    img.compressToJpeg(rect, 100, fos);
    fos.write(data);
    fos.close();
} catch (IOException e) {
    e.printStackTrace();
}
And this is the result with the last 3 approaches I followed.
Answer 1:
There are a few ways to save an NV21 frame coming from the camera; the easiest is to convert it to a YuvImage and then save it to a JPEG file:
FileOutputStream fos = new FileOutputStream(Environment.getExternalStorageDirectory() + "/imagename.jpg");
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, fos);
fos.close();
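For context, this snippet is normally run where the NV21 byte array and the preview size are available, for example in the camera preview callback. A minimal sketch, assuming the old android.hardware.Camera API, a preview format set to NV21, and the same hard-coded output path:
// Sketch: saving a preview frame as JPEG from the preview callback
// (assumes android.hardware.Camera with its preview format set to NV21).
camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] nv21bytearray, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        try {
            FileOutputStream fos = new FileOutputStream(
                    Environment.getExternalStorageDirectory() + "/imagename.jpg");
            YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21,
                    size.width, size.height, null);
            yuvImage.compressToJpeg(new Rect(0, 0, size.width, size.height), 100, fos);
            fos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
});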
Alternatively, you can also convert it to an Android Bitmap object and save it as a PNG or in another format:
YuvImage yuvImage = new YuvImage(nv21bytearray, ImageFormat.NV21, width, height, null);
ByteArrayOutputStream os = new ByteArrayOutputStream();
yuvImage.compressToJpeg(new Rect(0, 0, width, height), 100, os);
byte[] jpegByteArray = os.toByteArray();
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegByteArray, 0, jpegByteArray.length);
FileOutputStream fos = new FileOutputStream(Environment.getExternalStorageDirectory() + "/imagename.png");
bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
fos.close();
Note that the last option performs an NV21 -> JPEG -> Bitmap object -> PNG file conversion, so keep in mind that this is not a very efficient way to save preview images from the camera if you need high performance.
UPDATE: I got tired of how long it takes to get this conversion working, so I wrote a library (easyRS) around RenderScript to easily do this in one line of code:
Bitmap outputBitmap = Nv21Image.nv21ToBitmap(rs, nv21ByteArray, width, height);
It is around five times faster than the JPEG approach for a 2000x2000 image on a Moto G 2nd gen.
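For reference, using the library boils down to creating the RenderScript context once and calling the line above. A minimal sketch based on the call quoted in this answer; context, nv21ByteArray, width and height are assumed to come from your own code:
// Sketch based on the easyRS call quoted above; rs should be created once and reused.
RenderScript rs = RenderScript.create(context);
Bitmap outputBitmap = Nv21Image.nv21ToBitmap(rs, nv21ByteArray, width, height);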
Answer 2:
Your second try (with ScriptIntrinsicYuvToRGB) looks promising. With Jelly Bean 4.3 (API 18) or higher, do the following (the camera preview format must be NV21):
First, don't create the rs object, the yuvToRgbIntrinsic and the allocations in a method or in a loop where the script will be executed. This will slow down your app extremely and may cause out-of-memory errors. Put these in the onCreate() method:
rs = RenderScript.create(this); // create rs object only once and use it as long as possible
yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs)); // ditto.
With API 18+ the allocations are created much more easily (you do not need Type.Builder objects!). With your cameraPreviewWidth and cameraPreviewHeight, create the allocation aIn:
int yuvDatalength = cameraPreviewWidth*cameraPreviewHeight*3/2; // NV21 uses 12 bits per pixel
aIn = Allocation.createSized(rs, Element.U8(rs), yuvDatalength);
You need a Bitmap for output:
bmpout = Bitmap.createBitmap(cameraPreviewWidth, cameraPreviewHeight, Bitmap.Config.ARGB_8888);
and simply create the aOut allocation from this Bitmap:
aOut = Allocation.createFromBitmap(rs, bmpout);
Set the script's input allocation (only once, outside the loop):
yuvToRgbIntrinsic.setInput(aIn); //Set the input yuv allocation, must be U8(RenderScript).
In the "camera loop" do with byte[] data :
aIn.copyFrom(data); // or aIn.copyFromUnchecked(data); // which is faster and safe with camera data
yuvToRgbIntrinsic.forEach(aOut); // launch the kernel and convert the image to RGB
aOut.copyTo(bmpout); // copy data from Allocation aOut to Bitmap bmpout
For example, on Nexus 7 (2013, JellyBean 4.3) a full HD (1920x1080 pixel) camera preview conversion takes about 7 ms.
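Putting the steps above together, here is a minimal sketch of the complete setup. Field and variable names follow the snippets in this answer; the init helper and the Activity/PreviewCallback wiring are assumptions, not part of the original answer:
// Sketch combining the snippets above (API 18+, camera preview format NV21).
// Assumes this code lives in an Activity that implements Camera.PreviewCallback.
private RenderScript rs;
private ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic;
private Allocation aIn, aOut;
private Bitmap bmpout;

void initConversion(int cameraPreviewWidth, int cameraPreviewHeight) { // call once, e.g. from onCreate()
    rs = RenderScript.create(this);
    yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, Element.U8_4(rs));
    int yuvDatalength = cameraPreviewWidth * cameraPreviewHeight * 3 / 2; // NV21: 12 bits per pixel
    aIn = Allocation.createSized(rs, Element.U8(rs), yuvDatalength);
    bmpout = Bitmap.createBitmap(cameraPreviewWidth, cameraPreviewHeight, Bitmap.Config.ARGB_8888);
    aOut = Allocation.createFromBitmap(rs, bmpout);
    yuvToRgbIntrinsic.setInput(aIn); // set the input allocation once, outside the loop
}

@Override
public void onPreviewFrame(byte[] data, Camera camera) { // the "camera loop"
    aIn.copyFromUnchecked(data);     // copy the NV21 frame into the input allocation
    yuvToRgbIntrinsic.forEach(aOut); // run the conversion kernel
    aOut.copyTo(bmpout);             // bmpout now holds the converted RGB frame
}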
Source: https://stackoverflow.com/questions/32276522/convert-nv21-byte-array-into-bitmap-readable-format