Index Buffer Object and UV Coordinates don't play nice


Question


This generates the 25 main vertices.

    For x As Single = -1 To 1 Step 0.5F
        For y As Single = 1 To -1 Step -0.5F
            Dim pt1 As New Vector3(x, y, 0)
            tFloats.Add(pt1)
        Next
    Next

These are the indices, which make up the 16 tiles (32 triangles). I actually generate them, but this is the first row:

        Dim inasd() As Integer = {
        0, 2, 10,
        2, 10, 12,
        10, 12, 20,
        12, 20, 22,
        20, 22, 30,
        22, 30, 32
        }

Now I'm trying to apply a texture to every single triangle, one texture per tile, so 16 different textures in total.

Now my problem lies in the fact that when I use

GL.DrawRangeElements(PrimitiveType.Triangles, 0, indices.Length - 1, 6, DrawElementsType.UnsignedShort, New IntPtr(0))

Rather than going along the UV data and using it like (0,1,2), (3,4,5), (6,7,8), it instead follows the index data and grabs UV coords like (0,2,10), (2,10,12), (10,12,20). So to counter that I did this:

        Dim UV_Data(12) As Vector2
        UV_Data(0) = New Vector2(0.0F, 1.0F)
        UV_Data(2) = New Vector2(0.0F, 0.0F)
        UV_Data(10) = New Vector2(1.0F, 1.0F)
        UV_Data(12) = New Vector2(1.0F, 0.0F)

This works great on the first tile. On the second tile, because it goes (10,12,20), it uses UV_Data(10) as the top-left corner of the triangle, which is the top-right corner of the texture, so it doesn't work. Is there any way I can get rid of the indexing on the UVs but not on the vertices? It's just causing such a headache for me. Otherwise, what can I do?

EDIT:

The vertex data and UV data are stored in two separate buffers like this:

    GL.GenBuffers(1, FloatBuffer)
    GL.BindBuffer(BufferTarget.ArrayBuffer, FloatBuffer)
    GL.BufferData(BufferTarget.ArrayBuffer, New IntPtr(floats.Length * Vector3.SizeInBytes), floats, BufferUsageHint.StaticDraw)

    GL.GenBuffers(1, UVBuffer)
    GL.BindBuffer(BufferTarget.ArrayBuffer, UVBuffer)
    GL.BufferData(BufferTarget.ArrayBuffer, New IntPtr(UV_Data.Length * Vector2.SizeInBytes), UV_Data, BufferUsageHint.StaticDraw)

They get to the graphics card like this:

    GL.EnableVertexAttribArray(0)
    GL.BindBuffer(BufferTarget.ArrayBuffer, FloatBuffer)
    GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, False, 0, 0)

    GL.EnableVertexAttribArray(1)
    GL.BindBuffer(BufferTarget.ArrayBuffer, UVBuffer)
    GL.VertexAttribPointer(1, 2, VertexAttribPointerType.Float, False, 0, 0)

I also have the indices buffer:

    GL.BindBuffer(BufferTarget.ElementArrayBuffer, IndicesBuffer)

It is then drawn with the GL.DrawRangeElements call shown above.
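
The post doesn't show how IndicesBuffer is created and filled, so here is a minimal sketch of a typical setup (an assumption, not the original code). Note that DrawElementsType.UnsignedShort in the draw call expects 16-bit indices, so the uploaded array is UShort in this sketch; with Integer indices, DrawElementsType.UnsignedInt would be the matching type.

    ' Hypothetical setup for the element buffer (not shown in the original post).
    ' The indices are converted to UShort to match DrawElementsType.UnsignedShort.
    Dim shortIndices(inasd.Length - 1) As UShort
    For i As Integer = 0 To inasd.Length - 1
        shortIndices(i) = CUShort(inasd(i))
    Next

    GL.GenBuffers(1, IndicesBuffer)
    GL.BindBuffer(BufferTarget.ElementArrayBuffer, IndicesBuffer)
    GL.BufferData(BufferTarget.ElementArrayBuffer, New IntPtr(shortIndices.Length * 2), shortIndices, BufferUsageHint.StaticDraw)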

My vertex shader:

#version 330 core

// Input vertex data, different for all executions of this shader.
layout(location = 0) in vec3 vertexPosition_modelspace;
layout(location = 1) in vec2 vertexUV;

// Output data ; will be interpolated for each fragment.
out vec2 UV;

// Values that stay constant for the whole mesh.

void main(){

// Output position of the vertex, in clip space : MVP * position
gl_Position.xyz = vertexPosition_modelspace;
gl_Position.w = 1;

// UV of the vertex. No special space for this one.
UV = vertexUV;
}

My fragment shader:

#version 330 core

// Interpolated values from the vertex shaders
in vec2 UV;

// Output data
out vec3 color;

// Values that stay constant for the whole mesh.
uniform sampler2D myTextureSampler;

void main(){

// Output color = color of the texture at the specified UV
color = texture2D( myTextureSampler, UV ).rgb;
}

EDIT 2: Is it possible to edit the UV data while rendering? For example:

    Dim UV_Data(indices.Length) As Vector2
    For i As Integer = 0 To 1
        UV_Data(indices(i)) = New Vector2(0.0F, 1.0F)
        UV_Data(indices(i + 1)) = New Vector2(0.0F, 0.0F)
        UV_Data(indices(i + 2)) = New Vector2(1.0F, 1.0F)
        UV_Data(indices(i + 3)) = New Vector2(1.0F, 0.0F)
        'GL.BindBuffer(BufferTarget.ArrayBuffer, UVBuffer)
        GL.BufferData(BufferTarget.ArrayBuffer, New IntPtr(UV_Data.Length * Vector2.SizeInBytes), UV_Data, BufferUsageHint.DynamicDraw)
        GL.VertexAttribPointer(1, 2, VertexAttribPointerType.Float, False, 0, 0)

        'GL.BindBuffer(BufferTarget.ElementArrayBuffer, IndicesBuffer)
        GL.BindTexture(TextureTarget.Texture2D, mapTextures.Item(i))
        GL.DrawRangeElements(PrimitiveType.Triangles, 0, indices.Length - 1, 6, DrawElementsType.UnsignedShort, New IntPtr(i * 12))
    Next
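
A variant of this idea would be to overwrite only the four UV entries that the current tile's indices point at, for example with GL.BufferSubData, instead of re-uploading the whole UV array every iteration. This is only a sketch; corner() is a hypothetical array holding the four distinct vertex indices of the current tile (e.g. 0, 2, 10, 12 for the first one).

    ' Sketch: patch just the four UVs referenced by this tile before drawing it.
    Dim tileUV() As Vector2 = {
        New Vector2(0.0F, 1.0F),
        New Vector2(0.0F, 0.0F),
        New Vector2(1.0F, 1.0F),
        New Vector2(1.0F, 0.0F)
    }

    GL.BindBuffer(BufferTarget.ArrayBuffer, UVBuffer)
    For c As Integer = 0 To 3
        ' corner(c) is the vertex index whose UV entry is being overwritten (hypothetical helper).
        GL.BufferSubData(BufferTarget.ArrayBuffer,
                         New IntPtr(corner(c) * Vector2.SizeInBytes),
                         New IntPtr(Vector2.SizeInBytes),
                         {tileUV(c)})
    Next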

Answer 1:


The index applies to the vertex data, the texture data, and any other attributes; there is no way around this. For example, with a simple quad (4 vertices), it can work like below.

var backgroundObj = [ -1,-1,-1,  1,-1,-1,  1,1,-1,  -1,1,-1 ];

var backgroundObjTexCoords = [ 0.000000,0.000000,  1.000000,0.000000,  1.000000,1.000000,  0.000000,1.000000 ];

var backgroundObjIndices = [ 0,1,2,  0,2,3 ];
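
Applied to the grid in the question, that means each tile has to own its four corner vertices, even where tiles touch: the shared positions are duplicated so that every copy can carry its own 0..1 UVs. A rough sketch of that de-indexed layout in VB (gridPoints(col, row) is a hypothetical 5x5 array of the corner positions, not something from the question):

    ' Hypothetical sketch: duplicate shared corners so every tile owns 4 vertices,
    ' each with its own 0..1 UV, and index only within the tile.
    ' Assumes Imports System.Collections.Generic and OpenTK (for Vector2/Vector3).
    Dim tileVerts As New List(Of Vector3)()
    Dim tileUVs As New List(Of Vector2)()
    Dim tileIndices As New List(Of Integer)()

    For col As Integer = 0 To 3
        For row As Integer = 0 To 3
            Dim baseIndex As Integer = tileVerts.Count

            ' Four corners of this tile: top-left, top-right, bottom-left, bottom-right.
            tileVerts.Add(gridPoints(col, row))
            tileVerts.Add(gridPoints(col + 1, row))
            tileVerts.Add(gridPoints(col, row + 1))
            tileVerts.Add(gridPoints(col + 1, row + 1))

            ' Each tile spans the full texture, matching the UV convention used in the question.
            tileUVs.Add(New Vector2(0.0F, 1.0F))
            tileUVs.Add(New Vector2(1.0F, 1.0F))
            tileUVs.Add(New Vector2(0.0F, 0.0F))
            tileUVs.Add(New Vector2(1.0F, 0.0F))

            ' Two triangles per tile, indexing only this tile's own vertices.
            tileIndices.AddRange({baseIndex, baseIndex + 1, baseIndex + 2, baseIndex + 2, baseIndex + 1, baseIndex + 3})
        Next
    Next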

EDIT1:

(Image from the original answer showing the vertex ordering along the grid; not reproduced here.)

If you take the opposite flow to the one in the image above, the triangles will follow the order 012, 213, 234, etc. You need to generate the vertices this way even if you are doing it manually.
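
For reference, generating index triples in that order is the usual strip-to-triangle-list pattern: every triangle reuses the previous two vertices, and every second triangle swaps its first two indices to keep the winding consistent. A small sketch (vertexCount is a hypothetical number of vertices along the strip):

    ' Sketch: build the 012, 213, 234, 435, ... index order described above.
    Dim stripIndices As New List(Of Integer)()
    For i As Integer = 0 To vertexCount - 3
        If i Mod 2 = 0 Then
            stripIndices.AddRange({i, i + 1, i + 2})      ' 012, 234, ...
        Else
            stripIndices.AddRange({i + 1, i, i + 2})      ' 213, 435, ...
        End If
    Next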



Source: https://stackoverflow.com/questions/21074552/index-buffer-object-and-uv-coordinates-dont-play-nice
