Question
I'm using the depth buffer of the current context to influence a texture I am displaying. The texture is one-dimensional and grayscale: left to right represents near to far, and the more pixels there are at a given depth, the brighter the texture is at that point, with black meaning no pixels are at that depth and white meaning all pixels are.
Now I have a solution that does glReadPixels() on the depth buffer, analyzes it on the CPU, and then writes the result back to the texture. Naturally this is a real bottleneck in the application.
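For reference, a minimal sketch of what that CPU path typically looks like; the bin count, texture handle, and buffer handling are my own assumptions, not details from the question:

```c
/* Sketch of the CPU path: read the depth buffer back, bin it into a
 * histogram on the CPU, and upload the result as a 1D grayscale texture.
 * HIST_BINS and histTex are illustrative; GL_LUMINANCE assumes a
 * compatibility-era context. */
#include <GL/gl.h>
#include <stdlib.h>

#define HIST_BINS 256

void update_histogram_cpu(int width, int height, GLuint histTex)
{
    float *depth = malloc((size_t)width * height * sizeof *depth);
    float hist[HIST_BINS] = {0};

    /* read the whole depth buffer back to system memory (the bottleneck) */
    glReadPixels(0, 0, width, height, GL_DEPTH_COMPONENT, GL_FLOAT, depth);

    /* bin depths: left (0.0, near) to right (1.0, far) */
    for (int i = 0; i < width * height; ++i) {
        int bin = (int)(depth[i] * (HIST_BINS - 1));
        hist[bin] += 1.0f;
    }
    /* normalize so "all pixels at one depth" maps to white */
    for (int b = 0; b < HIST_BINS; ++b)
        hist[b] /= (float)(width * height);

    /* write the histogram back into the 1D texture */
    glBindTexture(GL_TEXTURE_1D, histTex);
    glTexSubImage1D(GL_TEXTURE_1D, 0, 0, HIST_BINS,
                    GL_LUMINANCE, GL_FLOAT, hist);

    free(depth);
}
```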
I'm looking for an all-GPU solution where the depth buffer is analyzed in a shader or some such and the texture is updated that way. I thought about creating a fragment shader that reads the depth value and increments the corresponding pixel in the texture, but that would require fragment shaders to write to another texture, something I've learned is a no-no, especially if they have to write to the same pixel.
Is there a trick or technique that I am missing, or am I forced to involve the CPU in this?
Answer 1:
Luckily there's a trick: vertex shaders can sample textures too. So you can issue a lot of GL_POINTS, each corresponding to an individual fragment in the depth texture; in the vertex shader you read from the depth texture to determine the transformed position of the point, and in the fragment shader for the points you just plot a value with a suitable alpha to cause the accumulation you desire.
So you've got the vertex shader reading one texture, the fragment shader not reading any textures, and you're using the normal render-to-texture mechanism to write to your histogram.
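A hedged sketch of what that could look like, assuming the depth buffer is available as a texture, the histogram is a 1D render target with additive blending, and an OpenGL 2.x context with a function loader such as GLEW; the names u_depthTex and a_texCoord are made up for illustration:

```c
/* One GL_POINT is drawn per depth-buffer texel: the vertex shader samples
 * the depth texture and places the point along X at that depth, and the
 * fragment shader emits a small constant that accumulates via additive
 * blending into the 1D histogram render target. */
#include <GL/glew.h>

static const char *histVertSrc =
    "#version 120\n"
    "uniform sampler2D u_depthTex;   /* depth buffer bound as a texture */\n"
    "attribute vec2 a_texCoord;      /* one texel coordinate per point  */\n"
    "void main() {\n"
    "    float d = texture2D(u_depthTex, a_texCoord).r;\n"
    "    /* depth 0..1 -> clip-space X -1..1; Y is irrelevant for a 1D target */\n"
    "    gl_Position = vec4(d * 2.0 - 1.0, 0.0, 0.0, 1.0);\n"
    "}\n";

static const char *histFragSrc =
    "#version 120\n"
    "void main() {\n"
    "    /* tiny constant contribution; the blender does the accumulation */\n"
    "    gl_FragColor = vec4(1.0 / 4096.0);\n"
    "}\n";

/* After compiling/linking the shaders above and filling texCoordVBO with
 * one (s, t) pair per depth-buffer texel: */
void draw_histogram_points(GLuint program, GLuint texCoordVBO, int numTexels)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);                 /* additive accumulation */

    glUseProgram(program);
    GLint loc = glGetAttribLocation(program, "a_texCoord");
    glBindBuffer(GL_ARRAY_BUFFER, texCoordVBO);
    glEnableVertexAttribArray(loc);
    glVertexAttribPointer(loc, 2, GL_FLOAT, GL_FALSE, 0, 0);

    glDrawArrays(GL_POINTS, 0, numTexels);

    glDisableVertexAttribArray(loc);
    glDisable(GL_BLEND);
}
```

Note that this relies on vertex texture fetch, so the hardware must report at least one vertex texture image unit.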
Answer 2:
As an upgrade to @Tommy's proposal I would suggest using a PBO instead of the vertex texture fetch (a sketch of these steps follows below):
1. Copy the depth buffer into the PBO.
2. Bind the PBO as a VBO and bind the vertex attribute (depth) in it.
3. Call DrawArrays with as many elements as your depth buffer resolution.
4. …
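A rough sketch of those steps, assuming the PBO has already been allocated with glBufferData large enough for width*height floats and that the histogram target uses the same additive-blending point setup as in the first answer; all identifiers are illustrative:

```c
/* The depth buffer never leaves the GPU: it is packed into the PBO, then
 * that same buffer object is reinterpreted as a VBO whose single float
 * attribute is the depth of each pixel. */
#include <GL/glew.h>

void draw_histogram_pbo(GLuint depthPBO, GLuint program,
                        GLint depthAttrib, int width, int height)
{
    /* 1. pack the depth buffer into the PBO (stays in video memory) */
    glBindBuffer(GL_PIXEL_PACK_BUFFER, depthPBO);
    glReadPixels(0, 0, width, height, GL_DEPTH_COMPONENT, GL_FLOAT, 0);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);

    /* 2. rebind the same buffer object as a vertex buffer */
    glBindBuffer(GL_ARRAY_BUFFER, depthPBO);
    glUseProgram(program);
    glEnableVertexAttribArray(depthAttrib);
    glVertexAttribPointer(depthAttrib, 1, GL_FLOAT, GL_FALSE, 0, 0);

    /* 3. one point per depth-buffer pixel; the vertex shader maps the
     *    attribute to an X position and additive blending accumulates
     *    the histogram exactly as in the first answer */
    glDrawArrays(GL_POINTS, 0, width * height);

    glDisableVertexAttribArray(depthAttrib);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```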
Source: https://stackoverflow.com/questions/5647727/create-depth-buffer-histogram-texture-with-glsl