Question
I am developing an AR app in Unity for Project Tango.
One of the things I am trying to accomplish is getting the camera frame image from the device while using the AR example provided with the SDK - https://github.com/googlesamples/tango-examples-unity
The problem is that the example uses IExperimentalTangoVideoOverlay, which doesn't return the frame buffer (the image is converted from YUV to RGB in the shader).
I registered for the OnExperimentalTangoImageAvailable event and called an Android native plugin that uses glReadPixels to get the frame image, but that didn't work either (maybe because the plugin is a library with no GL context?).
I am open to other solutions, but this is the code my Android plugin uses to read the image frame:
public void Decode() {
    try {
        bb = ByteBuffer.allocateDirect(screenshotSize);
        bb.order(ByteOrder.nativeOrder());
        GLES30.glReadBuffer(GLES30.GL_BACK);
        GLES30.glReadPixels(0, 0, 1920, 1200, GLES30.GL_LUMINANCE, GLES30.GL_UNSIGNED_BYTE, bb);
        byte[] myTest = bb.array();
        // For testing: check whether at least one byte is non-zero
        for (int i = 0; i < screenshotSize; ++i) {
            if (myTest[i] != 0) {
                Log.d("MYAPP", "I AM NOT ZERO!!");
                break;
            }
        }
        Decode(myTest, 1920, 1200);
    }
    catch (Exception e) {
        Log.d("MYAPP", "Error: " + e.getMessage());
    }
}
I'd appreciate any help I can get. Thanks!
Edited:
Problem solved! Thanks to Jason Guo.
I should have used GLES20, not GLES30. Also, OpenGL ES on Android doesn't support GL_LUMINANCE as a glReadPixels format.
The fixed line is: GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bb);
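To illustrate the fix, here is a minimal desktop-runnable sketch of the surrounding buffer handling. The dimensions are the ones from the question; `rgbaBufferSize` and `toArray` are hypothetical helper names, and the actual `GLES20.glReadPixels` call (which needs an Android GL context) is shown as a comment at the point where it would go. Note that switching from GL_LUMINANCE (1 byte per pixel) to GL_RGBA (4 bytes per pixel) quadruples the required buffer size, and that `bb.array()` is not guaranteed to work on a direct buffer, so `bb.get(...)` is the safer way to copy the pixels out.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FrameReader {
    // Dimensions from the question; in a real app they should come
    // from the device's camera/surface configuration.
    static final int WIDTH = 1920;
    static final int HEIGHT = 1200;

    // GL_RGBA is 4 bytes per pixel, so the buffer must be sized accordingly.
    static int rgbaBufferSize(int width, int height) {
        return width * height * 4;
    }

    // Copies a direct buffer into a heap array. Direct buffers may not
    // expose a backing array (bb.array() can throw
    // UnsupportedOperationException), so bb.get(...) is the safe path.
    static byte[] toArray(ByteBuffer bb) {
        byte[] out = new byte[bb.remaining()];
        bb.get(out);
        return out;
    }

    public static void main(String[] args) {
        int size = rgbaBufferSize(WIDTH, HEIGHT);
        ByteBuffer bb = ByteBuffer.allocateDirect(size).order(ByteOrder.nativeOrder());

        // On Android, inside a valid GL context, the corrected call would be:
        // GLES20.glReadPixels(0, 0, WIDTH, HEIGHT,
        //         GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, bb);

        byte[] pixels = toArray(bb);
        System.out.println(size);          // 9216000
        System.out.println(pixels.length); // 9216000
    }
}
```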
Source: https://stackoverflow.com/questions/35217052/unity-plugin-using-opengl-for-project-tango