Question
Say I have an image of size 320x240. Now, when sampling from a sampler2D
with integer image coordinates ux, uy,
I must normalize the coordinates, which lie in the range [0, size] (size being the width or height), to texture coordinates.
Now, I wonder if I should normalize like this
texture(image, vec2(ux/320.0, uy/240.0))
or like this
texture(image, vec2(ux/319.0, uy/239.0))
Because ux = 0 ... 319 and uy = 0 ... 239, the latter will actually cover the whole range [0, 1], correct? That means 0 corresponds to, e.g., the left-most pixels and 1 corresponds to the right-most pixels, right?
Also, I want to keep filtering, so I would prefer not to use texelFetch.
Can anyone shed some light on this? Thanks.
Answer 1:
No, the first one is actually correct:
texture(image, vec2(ux/320.0, uy/240.0))
Your premise that "ux = 0 ... 319 and uy = 0 ... 239" is incorrect. If you render a 320x240 quad, say, then it is actually ux = 0 ... 320 and uy = 0 ... 240.
This is because pixels and texels are squares sampled at half-integer coordinates. So, for example, let's assume that you render your 320x240 texture on a 320x240 quad. Then the bottom-left pixel (0,0) will actually be sampled at screen-coordinates (.5,.5). You normalize it by dividing by (320,240), but then OpenGL will multiply the normalized coordinates back by (320,240) to get the actual texel coordinates, so it will sample (.5,.5) from the texture, which corresponds to the center of the (0,0) pixel, which returns its exact color.
It is important to think of pixels in OpenGL as squares, so that coordinates (0,0) correspond to the bottom-left corner of the bottom-left pixel and the non-normalized (w,h) corresponds to the top-right corner of the top-right pixel (for a texture of size (w,h)).
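To illustrate this, here is a minimal GLSL fragment shader sketch, assuming the texture is drawn 1:1 onto a viewport of the same size (the uniform name image is just an assumption for the example). gl_FragCoord.xy already holds the half-integer pixel centers (0.5, 0.5), (1.5, 0.5), ..., so dividing by the texture size maps each fragment exactly onto the center of its texel:
#version 330 core
uniform sampler2D image;   // assumed uniform name
out vec4 fragColor;
void main() {
    // gl_FragCoord.xy is at pixel centers: (0.5, 0.5), (1.5, 0.5), ...
    // Dividing by the texture size (e.g. 320x240) lands on texel centers,
    // so sampling with filtering returns each texel's exact color.
    vec2 uv = gl_FragCoord.xy / vec2(textureSize(image, 0));
    fragColor = texture(image, uv);
}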
Answer 2:
Texture coordinates (and pixel coordinates) go from 0 to 1 across the outer edges of the pixels, no matter how many pixels there are.
A 4 pixel wide texture
0          0.5          1    <- texture coordinate
|           |           |
V           V           V
+-----+-----+-----+-----+
|     |     |     |     |
|     |     |     |     |    <- texels
+-----+-----+-----+-----+
A 5 pixel wide texture
0             0.5             1    <- texture coordinate
|              |              |
V              V              V
+-----+-----+-----+-----+-----+
|     |     |     |     |     |
|     |     |     |     |     |    <- texels
+-----+-----+-----+-----+-----+
A 6 pixel wide texture
0                0.5                1    <- texture coordinate
|                 |                 |
V                 V                 V
+-----+-----+-----+-----+-----+-----+
|     |     |     |     |     |     |
|     |     |     |     |     |     |    <- texels
+-----+-----+-----+-----+-----+-----+
A 1 pixel wide texture
0 0.5 1    <- texture coordinate
|  |  |
V  V  V
+-----+
|     |
|     |    <- texels
+-----+
If you use u = integerTextureCoordinate / width for each texture coordinate, you'd get these coordinates:
0    0.25  0.5   0.75    <- u = intU / width;
|     |     |     |
V     V     V     V
+-----+-----+-----+-----+
|     |     |     |     |
|     |     |     |     |    <- texels
+-----+-----+-----+-----+
Those coordinates point directly between texels.
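For example, with GL_LINEAR filtering on the 4-texel texture, u = 1/4 = 0.25 falls exactly on the edge between texel 0 and texel 1, so texture(image, vec2(0.25, 0.5)) returns a 50/50 blend of those two texels rather than either one's exact color.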
But, the texture coords you want if you want to address specific texels are like this
 0.125 0.375 0.625 0.875
   |     |     |     |
   V     V     V     V
+-----+-----+-----+-----+
|     |     |     |     |
|     |     |     |     |    <- texels
+-----+-----+-----+-----+
Which you get from
u = (integerTextureCoord + .5) / width
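Applied to the original 320x240 question, a minimal GLSL sketch of that formula might look like the following (the uniform names image and texelIndex are assumptions for the example, not anything from the question):
#version 330 core
uniform sampler2D image;       // assumed 320x240 texture
uniform ivec2 texelIndex;      // assumed integer coordinates (ux, uy), 0..319 and 0..239
out vec4 fragColor;
void main() {
    vec2 size = vec2(textureSize(image, 0));    // (320.0, 240.0)
    // Offset to the texel center before normalizing, so texture() with
    // filtering enabled returns that texel's exact color.
    vec2 uv = (vec2(texelIndex) + 0.5) / size;  // ((ux + 0.5)/320, (uy + 0.5)/240)
    fragColor = texture(image, uv);
}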
Source: https://stackoverflow.com/questions/40574677/how-to-normalize-image-coordinates-for-texture-space-in-opengl