Question
I am trying to implement post-processing (blur, bloom, etc.) on the iPhone using OpenGL ES 2.0, and I am running into an issue. During my second rendering step, I end up drawing a completely black quad to the screen instead of the scene (the texture data appears to be missing), so I am wondering if the cause is my use of a single FBO. Is it incorrect to use a single FBO in the following fashion?
For the first pass (regular scene rendering),
- I attach a texture as GL_COLOR_ATTACHMENT0 and render into it: glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texturebuffer, 0)
For the second pass (post-processing),
- I attach the color renderbuffer to GL_COLOR_ATTACHMENT0: glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorRenderbuffer)
- Then I use the texture from the first pass to render a full-screen quad.
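For reference, the two passes described above might be sketched roughly as follows. This is a minimal sketch, not the asker's actual code: identifiers such as sceneFBO, drawScene, and drawFullScreenQuad are placeholders, and a live EAGL/GLES context is assumed to exist.

```c
#include <OpenGLES/ES2/gl.h>  /* iOS OpenGL ES 2.0 header */

/* Pass 1: point the FBO's color attachment at the offscreen texture
 * and render the scene into it. */
glBindFramebuffer(GL_FRAMEBUFFER, sceneFBO);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, texturebuffer, 0);
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
    /* handle incomplete framebuffer */
}
drawScene();  /* placeholder: regular scene rendering */

/* Pass 2: retarget the same FBO at the onscreen color renderbuffer,
 * then draw a full-screen quad that samples the pass-1 texture. */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, 0, 0);  /* detach the texture first */
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, colorRenderbuffer);
glBindTexture(GL_TEXTURE_2D, texturebuffer);  /* now safe to sample from it */
drawFullScreenQuad();  /* placeholder: fragment shader applies blur/bloom */
```

Note that the texture must not be attached to the currently bound FBO while it is being sampled, which is why it is detached before the renderbuffer is attached.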
Answer 1:
No, that FBO usage is fine. The cause is very likely that you didn't set the MIN/MAG filters of your texture (texturebuffer). After binding it, set GL_TEXTURE_MIN_FILTER and GL_TEXTURE_MAG_FILTER to GL_NEAREST or GL_LINEAR using glTexParameteri, and your texture should work. (With the default minification filter, GL_NEAREST_MIPMAP_LINEAR, a texture without mipmaps is incomplete and samples as black.)
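A minimal sketch of the fix, assuming texturebuffer is the render-target texture created earlier (a GLES context is assumed):

```c
#include <OpenGLES/ES2/gl.h>  /* iOS OpenGL ES 2.0 header */

/* Set sampling parameters once, right after creating the texture. */
glBindTexture(GL_TEXTURE_2D, texturebuffer);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
/* Non-power-of-two textures in ES 2.0 additionally require
 * clamp-to-edge wrapping to be complete. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```

The wrap-mode lines matter on the iPhone because screen-sized render targets are usually not powers of two.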
Answer 2:
As Matias states, your design looks fine. I've done something similar myself and had it work well.
I wrote a sample application that uses shaders to process camera images, which can be downloaded here, but it uses glReadPixels() to pull offscreen values into a processing routine rather than mapping the result directly to an onscreen texture. You might be able to modify it to do that.
Otherwise, I made a more complex shader sample application here that, in one part, renders a scene to a cube-mapped texture and then feeds that texture into a second processing stage. This should illustrate rendering to, and sampling from, a texture with OpenGL ES 2.0 shaders.
Maybe these could shed light on where your implementation is failing.
Source: https://stackoverflow.com/questions/4712042/iphone-post-processing-with-a-single-fbo-with-opengl-es-2-0