Question
I'm trying to do pretty much what TangoARScreen does, but with multithreaded rendering enabled in Unity. I did some experiments and I'm stuck.
I tried several things, such as letting Tango render into the OES texture, which would then be blitted into a regular Texture2D in Unity, but OpenGL keeps complaining about an invalid display when I try to use it. Perhaps OnTangoCameraTextureAvailable isn't even called in the correct GL context? Hard to say when you have no idea how Tango Core works internally.
Is registering a YUV texture via TangoService_Experimental_connectTextureIdUnity the way to go? I assume I'd have to handle the YUV-to-RGB conversion myself. Or should I use OnTangoImageMultithreadedAvailable and deal with the raw buffer, rendering it with a custom shader, for instance? The documentation is pretty sparse in these areas, and every experiment costs several wasted days at least. Did anyone get this working? Could you point me in the right direction? All I need is the live camera image rendered as Unity's camera background.
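For what it's worth, here is a minimal sketch of the buffer route, assuming the Tango Unity SDK's ITangoVideoOverlay callback and the TangoUnityImageData fields (width, height, data); the multithreaded variant (OnTangoImageMultithreadedAvailable) should deliver a similar buffer. The field names and the stride == width simplification are assumptions to verify against your SDK version; the NV21-to-RGB math itself is just the standard BT.601 conversion:

```csharp
using System;
using UnityEngine;
using Tango;

// Sketch only: consume the raw YUV buffer from the Tango callback and
// convert it to RGB on the CPU. The TangoUnityImageData field names
// (width, height, data) are assumptions; check your SDK version.
public class YuvCameraBackground : MonoBehaviour, ITangoVideoOverlay
{
    private Texture2D m_texture;
    private Color32[] m_pixels;
    private byte[] m_yuv;           // copy of the latest NV21 frame
    private int m_width, m_height;
    private volatile bool m_dirty;  // a new frame is waiting for upload

    public void Start()
    {
        // Assumes a TangoApplication component exists in the scene.
        TangoApplication tango = FindObjectOfType<TangoApplication>();
        tango.Register(this);
    }

    // This may fire off the main thread, so only copy the buffer here;
    // all Unity API calls happen in Update() on the main thread.
    public void OnTangoImageAvailableEventHandler(
        TangoEnums.TangoCameraId cameraId, TangoUnityImageData imageBuffer)
    {
        m_width = (int)imageBuffer.width;
        m_height = (int)imageBuffer.height;
        int size = m_width * m_height * 3 / 2; // NV21: Y plane + interleaved VU
        if (m_yuv == null || m_yuv.Length != size)
        {
            m_yuv = new byte[size];
        }
        Array.Copy(imageBuffer.data, m_yuv, size);
        m_dirty = true;
    }

    public void Update()
    {
        if (!m_dirty) { return; }
        m_dirty = false;
        if (m_texture == null)
        {
            m_texture = new Texture2D(m_width, m_height, TextureFormat.RGBA32, false);
            m_pixels = new Color32[m_width * m_height];
        }

        // CPU NV21 -> RGB, standard BT.601 coefficients. Assumes
        // stride == width and ignores the vertical flip for brevity.
        for (int y = 0; y < m_height; y++)
        {
            for (int x = 0; x < m_width; x++)
            {
                int yVal = m_yuv[y * m_width + x];
                int uvIndex = m_width * m_height + (y / 2) * m_width + (x / 2) * 2;
                int v = m_yuv[uvIndex] - 128;     // NV21 stores V first...
                int u = m_yuv[uvIndex + 1] - 128; // ...then U.
                int r = (int)(yVal + 1.402f * v);
                int g = (int)(yVal - 0.344f * u - 0.714f * v);
                int b = (int)(yVal + 1.772f * u);
                m_pixels[y * m_width + x] = new Color32(
                    (byte)Mathf.Clamp(r, 0, 255),
                    (byte)Mathf.Clamp(g, 0, 255),
                    (byte)Mathf.Clamp(b, 0, 255), 255);
            }
        }
        m_texture.SetPixels32(m_pixels);
        m_texture.Apply();
        // Draw m_texture as the camera background, e.g. on a full-screen quad.
    }
}
```

The per-pixel CPU loop is only there to make the conversion explicit; in practice you'd upload the Y and UV planes as textures and do the same matrix multiply in a fragment shader.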
Answer 1:
From the April 2017 Gankino release notes: "The C API now supports getting the latest camera image's timestamp outside a GL thread .... Unity multithreaded rendering support will get added in a future release." So I guess we need to wait a little bit. Multithreaded rendering can still be used in applications without a camera feed (motion tracking only), choosing "YUV Texture and Raw Bytes" as the overlay method in the Tango Application script.
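For the motion-tracking-only case mentioned above, a minimal pose consumer might look like the sketch below; ITangoPose and the TangoPoseData layout (translation as double[3], orientation as a double[4] quaternion) are taken from the Tango Unity SDK as I remember it, so verify the names against your version:

```csharp
using UnityEngine;
using Tango;

// Sketch of a motion-tracking-only setup, which works with multithreaded
// rendering since no camera feed is rendered. The TangoPoseData field
// names below are assumptions to check against your SDK version.
public class PoseFollower : MonoBehaviour, ITangoPose
{
    public void Start()
    {
        // Assumes a TangoApplication component exists in the scene.
        FindObjectOfType<TangoApplication>().Register(this);
    }

    public void OnTangoPoseAvailable(TangoPoseData poseData)
    {
        if (poseData.status_code != TangoEnums.TangoPoseStatusType.TANGO_POSE_VALID)
        {
            return;
        }
        // Tango reports poses in a right-handed frame; a real app would
        // convert axes to Unity's left-handed frame. Raw values are
        // applied directly here for brevity.
        transform.position = new Vector3(
            (float)poseData.translation[0],
            (float)poseData.translation[1],
            (float)poseData.translation[2]);
        transform.rotation = new Quaternion(
            (float)poseData.orientation[0],
            (float)poseData.orientation[1],
            (float)poseData.orientation[2],
            (float)poseData.orientation[3]);
    }
}
```

This only drives a Transform from the pose stream; the point is that nothing here touches the camera image, so the GL-context problems from the question never come up.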
Source: https://stackoverflow.com/questions/41324870/camera-texture-in-unity-with-multithreaded-rendering