I am working on an OpenGL ES fluid simulation which uses texture maps to hold grid values. I need to iterate through the grid in a way that mimics the following loop:
for (int r = 0; r < 128; r++)
    for (int c = 0; c < 128; c++)
        process grid element at (c, r)
To iterate through the grid, I simply fill a quadrilateral, which causes my fragment program to be invoked for each fragment. The texture coordinates (0,0), (1,0), (0,1), (1,1) are associated with the vertices (-1,-1), (+1,-1), (-1,+1), (+1,+1), and I render the quad (as a triangle strip) into a 128x128 texture map attached to an FBO as follows:
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, texFrameBuffer);
glViewport(0, 0, TEX_IMAGE_WIDTH, TEX_IMAGE_HEIGHT); // 128 x 128
glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glEnableVertexAttribArray(positionAttr);
glVertexAttribPointer(positionAttr, 2, GL_FLOAT, GL_FALSE, 0, 0);
glBindBuffer(GL_ARRAY_BUFFER, texCoordBuffer);
glEnableVertexAttribArray(texCoordAttr);
glVertexAttribPointer(texCoordAttr, 2, GL_FLOAT, GL_FALSE, 0, 0);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
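For reference, the two buffers hold the quad data described above; a minimal sketch of how they might be filled (the array names are mine, vertexBuffer and texCoordBuffer are the same buffer objects bound above):

/* Quad data in triangle-strip order, matching the vertex/texcoord mapping
   described above. */
static const GLfloat quadPositions[] = {
    -1.0f, -1.0f,   /* maps to texcoord (0, 0) */
    +1.0f, -1.0f,   /* maps to texcoord (1, 0) */
    -1.0f, +1.0f,   /* maps to texcoord (0, 1) */
    +1.0f, +1.0f,   /* maps to texcoord (1, 1) */
};
static const GLfloat quadTexCoords[] = {
    0.0f, 0.0f,
    1.0f, 0.0f,
    0.0f, 1.0f,
    1.0f, 1.0f,
};

glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(quadPositions), quadPositions, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, texCoordBuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(quadTexCoords), quadTexCoords, GL_STATIC_DRAW);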
The vertex shader merely passes the position and texture coordinates through unmodified. In order to understand what texture coordinates are being generated, I decided to capture the texture coordinates into an image. The fragment program assigns its given 2D texture coordinates to the red and green channels of the output color:
precision mediump float;  // GLSL ES fragment shaders need a default float precision
varying vec4 texCoord;
void main() {
    gl_FragColor = vec4(texCoord.st, 0.0, 1.0);
}
I then use glGetTexImage() to read the texture image back into client space.
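Core OpenGL ES itself has no glGetTexImage, so on the device the equivalent readback is a glReadPixels from the bound FBO; a rough sketch, assuming an RGBA8 color attachment:

/* Readback sketch: with the FBO bound, glReadPixels returns the rendered
   texels. GL_FRAMEBUFFER binds both the read and draw targets. */
GLubyte pixels[TEX_IMAGE_HEIGHT * TEX_IMAGE_WIDTH * 4];   /* RGBA8 */
glBindFramebuffer(GL_FRAMEBUFFER, texFrameBuffer);
glReadPixels(0, 0, TEX_IMAGE_WIDTH, TEX_IMAGE_HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* For texel (c, r): red holds s, green holds t, each quantized to 8 bits. */
for (int r = 0; r < TEX_IMAGE_HEIGHT; r++) {
    for (int c = 0; c < TEX_IMAGE_WIDTH; c++) {
        float s = pixels[(r * TEX_IMAGE_WIDTH + c) * 4 + 0] / 255.0f;
        float t = pixels[(r * TEX_IMAGE_WIDTH + c) * 4 + 1] / 255.0f;
        printf("(%d,%d) = [%f, %f]\n", c, r, s, t);
    }
}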
The following are samples of the generated texture coordinates (s,t) for each pixel in the image:
(column, row) = [s, t] -> (floor(128*s), floor(128*t))
(0,0) = [0.00392157, 0.00392157] -> (0,0)
(1,0) = [0.01176471, 0.00392157] -> (1,0)
(2,0) = [0.01960784, 0.00392157] -> (2,0)
(3,0) = [0.02745098, 0.00392157] -> (3,0)
(4,0) = [0.03529412, 0.00392157] -> (4,0)
(5,0) = [0.04313726, 0.00392157] -> (5,0)
(6,0) = [0.05098040, 0.00392157] -> (6,0)
(7,0) = [0.05882353, 0.00392157] -> (7,0)
(8,0) = [0.06666667, 0.00392157] -> (8,0)
(9,0) = [0.07450981, 0.00392157] -> (9,0)
(10,0) = [0.08235294, 0.00392157] -> (10,0)
<snip>
(125,0) = [0.98039222, 0.00392157] -> (125,0)
(126,0) = [0.98823535, 0.00392157] -> (126,0)
(127,0) = [0.99607849, 0.00392157] -> (127,0)
(0,1) = [0.00392157, 0.01176471] -> (0,1)
(1,1) = [0.01176471, 0.01176471] -> (1,1)
(2,1) = [0.01960784, 0.01176471] -> (2,1)
<snip>
(124,127) = [0.97254908, 0.99607849] -> (124,127)
(125,127) = [0.98039222, 0.99607849] -> (125,127)
(126,127) = [0.98823535, 0.99607849] -> (126,127)
(127,127) = [0.99607849, 0.99607849] -> (127,127)
Now to the question: I am trying to understand these generated coordinates. The magic value 0.00392157 is 0.5 * (1/127.5). I understand the 0.5 factor, which acts as a rounding value that is pre-added in, but why 127.5? 0.5 * (1/128.0) would make more sense. Can anyone explain these coordinates? I just want to generate integer grid coordinates from the texture coordinates (there is no sampler2DRect in OpenGL ES).
I always struggle with this, so let's see if I can explain it.
Imagine one row of your grid. In the following picture, I've numbered each of the fragments (0-127) in that row. I've put some texture coordinates below the pixels, too. Notice that the leftmost edge is 0.0 and the rightmost edge is 1.0.
+-----------+-----------+-----------+-----------+--- ---+-----------+
| | | | | | |
| | | | | | |
| 0 | 1 | 2 | 3 | . . . | 127 |
| | | | | | |
| | | | | | |
+-----------+-----------+-----------+-----------+--- ---+-----------+
^ ^ ^ ^ ^ ^
| | | | | |
| | | | | |
0/128 1/128 2/128 3/128 127/128 128/128
When the renderer wants to texture a fragment, it uses the texture coordinates of the center of the fragment, where the number is. Notice that the center of fragment 0 is half-way between 0/128 (aka 0) and 1/128 (aka .0078125). That's 1/256 (aka .00390625).
So, I think the formula would be better expressed as:
coordinate = (pixelId + 0.5) / 128
Here's some python that gets answers similar to yours:
for i in range(128):
    print((0.5 + i) / 128)
0.00390625
0.01171875
0.01953125
0.02734375
0.03515625
0.04296875
...
0.97265625
0.98046875
0.98828125
0.99609375
I suspect that the difference between my results and your results has something to do with the fact that your values were squeezed into one [8-bit] color channel to return them from the shader.
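To make that concrete: 0.00390625 written to an 8-bit channel comes back as round(0.00390625 * 255) / 255 = 1/255 = 0.00392157, which is your magic number, and 0.5 * (1/127.5) is just another way of writing 1/255. Here is a quick C check of the whole chain, assuming an RGBA8 render target:

#include <math.h>
#include <stdio.h>

int main(void) {
    for (int i = 0; i < 128; i++) {
        float ideal  = (i + 0.5f) / 128.0f;               /* texcoord at the fragment center */
        float stored = roundf(ideal * 255.0f) / 255.0f;   /* after an RGBA8 target + readback */
        int   cell   = (int)floorf(stored * 128.0f);      /* recovered integer grid coordinate */
        printf("%3d: ideal=%.8f stored=%.8f cell=%3d\n", i, ideal, stored, cell);
    }
    return 0;
}

Either way, since the centers sit at (i + 0.5)/128, floor(s * 128.0) recovers the integer column, so in your fragment shader something like int c = int(floor(texCoord.s * 128.0)); (and the same for t) gives the integer grid coordinates you asked about at the end.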
I hope this helps.
Source: https://stackoverflow.com/questions/10289616/interpolated-texture-coordinates