Android Video Player Using NDK, OpenGL ES, and FFmpeg

南笙 · 2020-12-22 17:22

Ok so here is what I have so far. I have built FFmpeg on Android and am able to use it fine. I have been able to load a video into FFmpeg after passing the chosen filename f…

1 Answer
  • 2020-12-22 18:16

    One way that springs to mind is to draw the pixels of your frame into a texture and then render that texture using OpenGL.
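
    A minimal sketch of the rendering half, assuming the GLES 1.x fixed-function pipeline and a texture that already holds the current frame's RGBA pixels (allocation and per-frame upload are covered in the EDIT below). The function name render_texture is hypothetical, not from the linked post:

        #include <GLES/gl.h>

        /* Draw the whole texture as a screen-aligned quad using the GLES 1.x
         * fixed-function pipeline. With default identity matrices the
         * [-1, 1] vertex range fills the viewport. */
        void render_texture(GLuint tex)
        {
            /* Vertex order: bottom-left, bottom-right, top-left, top-right. */
            static const GLfloat verts[] = { -1.f, -1.f,   1.f, -1.f,
                                             -1.f,  1.f,   1.f,  1.f };
            /* These texture coords assume the frame fills the whole texture;
             * with a power-of-2 texture larger than the frame (see the EDIT
             * below), scale them down to the occupied region. */
            static const GLfloat uvs[]   = {  0.f,  1.f,   1.f,  1.f,
                                              0.f,  0.f,   1.f,  0.f };

            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, tex);

            glEnableClientState(GL_VERTEX_ARRAY);
            glEnableClientState(GL_TEXTURE_COORD_ARRAY);
            glVertexPointer(2, GL_FLOAT, 0, verts);
            glTexCoordPointer(2, GL_FLOAT, 0, uvs);
            glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
        }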

    I wrote a blog post a while back on how to go about this, primarily for old-skool pixel-based video games, but it also applies to your situation. The post is Android Native Coding in C, and I set up a GitHub repository with an example. Using this technique I have been able to get 60 FPS, even on first-generation hardware.
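
    The FFmpeg side of your player isn't shown here, but assuming you have a decoded AVFrame, one way to get its pixels into a GL-friendly layout is to convert them to packed RGBA with libswscale. This is only a sketch, and frame_to_rgba is a hypothetical helper, not part of the linked example:

        #include <libavutil/frame.h>
        #include <libavutil/mem.h>
        #include <libswscale/swscale.h>

        /* Convert a decoded AVFrame (e.g. YUV420P) into a tightly packed RGBA
         * buffer that can be handed straight to glTexSubImage2D. The caller
         * frees the result with av_free(). */
        uint8_t *frame_to_rgba(const AVFrame *frame)
        {
            struct SwsContext *sws = sws_getContext(
                frame->width, frame->height, (enum AVPixelFormat)frame->format,
                frame->width, frame->height, AV_PIX_FMT_RGBA,
                SWS_BILINEAR, NULL, NULL, NULL);
            if (!sws)
                return NULL;

            uint8_t *rgba = av_malloc((size_t)frame->width * frame->height * 4);
            if (!rgba) {
                sws_freeContext(sws);
                return NULL;
            }

            uint8_t *dst_data[4] = { rgba, NULL, NULL, NULL };
            int dst_linesize[4] = { frame->width * 4, 0, 0, 0 };

            sws_scale(sws, (const uint8_t * const *)frame->data, frame->linesize,
                      0, frame->height, dst_data, dst_linesize);
            sws_freeContext(sws);
            return rgba;
        }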

    EDIT regarding glTexImage2D vs glTexSubImage2D for this approach.

    Calling glTexImage2D will allocate video memory for your texture and copy the pixels you pass it into that memory (if you don't pass NULL). Calling glTexSubImage2D will update the pixels you specify in an already-allocated texture.

    If you update the whole texture then there's little difference between the two; in fact, glTexImage2D is usually faster. But if you only update part of the texture, glTexSubImage2D wins out on speed.
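
    To make the pattern concrete, here is a sketch of "allocate once with glTexImage2D, update per frame with glTexSubImage2D"; create_frame_texture and upload_frame are hypothetical names:

        #include <GLES/gl.h>

        GLuint create_frame_texture(int tex_w, int tex_h)
        {
            GLuint tex;
            glGenTextures(1, &tex);
            glBindTexture(GL_TEXTURE_2D, tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

            /* One-time allocation: reserves tex_w x tex_h of video memory but
             * copies nothing, because the data pointer is NULL. */
            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tex_w, tex_h, 0,
                         GL_RGBA, GL_UNSIGNED_BYTE, NULL);
            return tex;
        }

        void upload_frame(GLuint tex, const void *rgba, int frame_w, int frame_h)
        {
            glBindTexture(GL_TEXTURE_2D, tex);

            /* Per-frame update: rewrites only the frame-sized region of the
             * already-allocated texture and leaves the rest untouched. */
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frame_w, frame_h,
                            GL_RGBA, GL_UNSIGNED_BYTE, rgba);
        }

    In the render loop, upload_frame would be called once per decoded frame on the texture created at start-up, then the quad drawn from it.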

    You have to use power-of-2 texture sizes, so covering the screen on hi-res devices requires a 1024x512 texture, and a 512x512 texture on medium-resolution devices. The texture is larger than the screen area (hi-res is 800x400-ish), which means you only need to update part of it, so glTexSubImage2D is the way to go.
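
    As a rough illustration of the sizing arithmetic (next_pow2 is a hypothetical helper; the numbers assume the roughly 800x400 screen mentioned above):

        /* Round the frame/screen dimensions up to powers of two for the
         * texture, then shrink the quad's texture coordinates so only the
         * occupied region is sampled. */
        static int next_pow2(int x)
        {
            int p = 1;
            while (p < x)
                p <<= 1;
            return p;
        }

        /* For an 800x400 frame:
         *   tex_w = next_pow2(800) = 1024, tex_h = next_pow2(400) = 512
         *   u_max = 800.0f / 1024 = 0.78125, v_max = 400.0f / 512 = 0.78125
         * Each frame, glTexSubImage2D updates only the 800x400 corner of the
         * 1024x512 texture, and the quad's texture coordinates run from
         * (0, 0) to (u_max, v_max) rather than (1, 1). */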
