Question
I'm using/targeting OpenGL 4.2, GLSL 4.20, core profile, in C.
I'm exploring graphics with OpenGL and have decided to make a tile-based game. I'm at the point where I'd like to actually draw a bunch of tiles, and am trying to use glDrawElements to draw individual triangles.
Relevant code:
const float vertexPositions[] = {
    //Positions (x, y), one pair per vertex 0–11.
    -1.0f, -0.8f, //0
    -0.8f, -0.8f, //1
    -0.6f, -0.8f, //2
    -0.4f, -0.8f, //3
    -0.2f, -0.8f, //4
     0.0f, -0.8f, //5
    -1.0f, -1.0f, //6
    -0.8f, -1.0f, //7
    -0.6f, -1.0f, //8
    -0.4f, -1.0f, //9
    -0.2f, -1.0f, //10
     0.0f, -1.0f, //11
    //The following are the texture coordinates, one (u, v) pair per vertex.
    0.00f, 1.00f, //0
    1.00f, 1.00f, //1
    0.00f, 1.00f, //2
    1.00f, 1.00f, //3
    0.00f, 1.00f, //4
    1.00f, 1.00f, //5
    0.00f, 0.00f, //6
    1.00f, 0.00f, //7
    0.00f, 0.00f, //8
    1.00f, 0.00f, //9
    0.00f, 0.00f, //10
    1.00f, 0.00f, //11
};
const GLubyte indices[] = {
    0, 1, 6,
    1, 7, 6,
    1, 2, 7,
    2, 8, 7,
    2, 3, 8,
    3, 9, 8,
    3, 4, 9,
    4, 10, 9,
    4, 5, 10,
    5, 11, 10,
};
with
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, 0);
//Offset 96 = 12 vertices * 2 floats * sizeof(float): the texture
//coordinates start right after the 24 position floats.
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 0, (void *)96);
//30 indices = 10 triangles * 3 (two triangles per tile, five tiles).
glDrawElements(GL_TRIANGLES, 30, GL_UNSIGNED_BYTE, indices);
(There are other pieces, but it's mostly formatting/attribute enabling etc. I'll provide more if needed).
Basically what I get is:
0 1 2 3 4 5
+--+--+--+--+--+
|b | d|b |d |b |
+--+--+--+--+--+
6 7 8 9 10 11
and what i want is:
0 1 2 3 4 5
+--+--+--+--+--+
|b |b |b |b |b |
+--+--+--+--+--+
6 7 8 9 10 11
Where "b" is the correct orientation of the texture on a square and "d" is reversed/reflected.
I realize that since the vertex coordinates are shared this is going to happen, but from my own thoughts/some posts below it seems unavoidable: either I conserve vertices and get reflected textures, or I end up sending 3*2*number_of_tiles vertices, which I feel defeats the purpose of glDrawElements.
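To make that trade-off concrete, here's the vertex arithmetic written out as a toy sketch (plain C, not GL code; the function names are mine, and the "fully shared" case is my current 12-vertex layout):

```c
/* Vertices needed for a single row of n tiles under three layouts. */
static int unindexedVerts(int n)   { return 6 * n; }       /* raw glDrawArrays: 2 triangles * 3 verts each  */
static int perTileQuadVerts(int n) { return 4 * n; }       /* indexed, shared edges duplicated per tile      */
static int sharedGridVerts(int n)  { return 2 * (n + 1); } /* fully shared grid (this question's layout)     */
```

So even if shared-edge positions get duplicated, indexed drawing would still need only 4*n vertices rather than the 6*n of raw triangles.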
Resources that haven't helped:
These are things I've found that are related, but seem (to me) to be either deprecated usage/not answered/not quite what I'm asking. Feel free to point me back to one of these posts if it in fact does answer my question.
This seems to say that glDrawElements is useless for a cube, and that you must have individual vertices making up each face. Does this apply to a 2D grid as well?
The answer to this question says (among other things) "Two squares with the same vertex positions, but different texture coordinates must be represented with different quads." but I don't understand what he's getting at.
This one I thought was close. In the answer: The first link is broken/access denied. The second link has an answer that basically says the vertices must be distinct. Is this required? In the third link (second code snippet) they seem to just generate all 36 vertices. For one cube this is probably OK, but for n cubes..? Am I missing the point of his example, maybe?
Is what I'm requesting even possible? Am I using completely incorrect functions for what I'm trying to achieve? "UR DOIN IT RONG"?
I've thought of re-arranging the coordinates so it looks like:
0 2 4 6 8 10
+--+--+--+--+--+
| | | | | |
+--+--+--+--+--+
1 3 5 7 9 11
But I didn't see how that would fix the issue, since vertices are still shared.
Thanks, and feel free to correct me/my question. First one, so I expect it to need revising.
Answer 1:
The second link has an answer that basically says the vertices must be distinct. Is this required?
Yes. You want each individual quad to have texture coordinates that go from 0 to 1. Which means that the positions that make up the right side of the first quad can't use the same texture coordinates as the positions that make up the left side of the second quad.
And since a single index fetches every attribute for that vertex at once, you cannot pair one position with two different sets of texture coordinates; you must duplicate your positions.
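A sketch of what that duplication looks like in practice (plain C; the tile size and coordinates are assumed from the question, the helper name and layout are mine, and the buffer upload/draw calls are omitted). Each tile gets its own four interleaved vertices (x, y, u, v), so a shared edge repeats the position but carries its own texture coordinates, and glDrawElements still builds each quad from 4 vertices instead of 6:

```c
#define NUM_TILES 5

/* 4 unique vertices per tile, interleaved as x, y, u, v. */
static float vertices[NUM_TILES * 4 * 4];
/* 6 indices per tile (two triangles per quad). */
static unsigned char tileIndices[NUM_TILES * 6];

static void buildTileRow(float x0, float y0, float tileSize)
{
    for (int t = 0; t < NUM_TILES; ++t) {
        float left  = x0 + t * tileSize;
        float right = left + tileSize;
        /* Corner order: top-left, top-right, bottom-left, bottom-right,
           each with its own 0..1 texture coordinates. */
        float corner[4][4] = {
            { left,  y0,            0.0f, 1.0f },
            { right, y0,            1.0f, 1.0f },
            { left,  y0 - tileSize, 0.0f, 0.0f },
            { right, y0 - tileSize, 1.0f, 0.0f },
        };
        for (int c = 0; c < 4; ++c)
            for (int k = 0; k < 4; ++k)
                vertices[(t * 4 + c) * 4 + k] = corner[c][k];

        /* Same winding as the question's index list: TL,TR,BL then TR,BR,BL. */
        unsigned char base = (unsigned char)(t * 4);
        unsigned char tri[6] = { 0, 1, 2, 1, 3, 2 };
        for (int i = 0; i < 6; ++i)
            tileIndices[t * 6 + i] = base + tri[i];
    }
}
```

With this interleaved layout the attribute pointers would use a stride of 4 * sizeof(float), the texture coordinates would sit at byte offset 2 * sizeof(float), and the count passed to glDrawElements becomes NUM_TILES * 6.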
Source: https://stackoverflow.com/questions/13170037/how-to-use-gldrawelements-while-keeping-texture-coordinates-correct