glDrawArrays vs glDrawElements

野趣味 asked on 2021-01-12 05:57

Ok so I'm still struggling to get this to work. The important parts of my code are:

def __init__(self, vertices, normals, triangles):
    self.bufferVertice         


        
3 Answers
    栀梦 answered on 2021-01-12 06:37

    I have no experience with Python GL, but I think I spotted something. You use len(self.triangles) in the call to glDrawElements, so I suppose that gives you the number of indices in the triangles array. But why then use len(triangles) as the size in glBufferData rather than ADT.arrayByteCount, like in the other calls? Your buffer is simply too small: it contains len(triangles) bytes, even though triangles contains unsigned ints. If triangles really contains bytes (which I doubt), you would have to use GL_UNSIGNED_BYTE in glDrawElements instead.
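    Here is a minimal sketch of a correctly sized index-buffer upload, assuming a current GL context and that triangles is a flat numpy array of 32-bit unsigned indices (the buffer name ibo and the example data are mine, not from your code):

        import numpy
        from OpenGL.GL import *
        from OpenGL.arrays import ArrayDatatype as ADT

        # Example data: two triangles sharing an edge, as a flat index array.
        triangles = numpy.array([0, 1, 2, 2, 1, 3], dtype=numpy.uint32)

        ibo = glGenBuffers(1)
        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo)
        # glBufferData expects the size in BYTES: each GL_UNSIGNED_INT index
        # is 4 bytes, so pass arrayByteCount(triangles), not len(triangles).
        glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                     ADT.arrayByteCount(triangles),   # len(triangles) * 4
                     triangles,
                     GL_STATIC_DRAW)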

    EDIT: Regarding your edits, a few more points. Of course glDrawElements(GL_POINTS, ...) needs indices, too. It just uses every index to draw a point, instead of every three indices to draw a triangle. For points you often don't need glDrawElements, since you don't reuse vertices anyway, but you still need indices for it; it doesn't magically become a glDrawArrays call under the hood.
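    In other words (a sketch, assuming the index buffer from the snippet above is still bound and a vertex pointer has been set up):

        # Both calls read the bound element (index) buffer; GL_POINTS just
        # emits one point per index instead of one triangle per three indices.
        glDrawElements(GL_TRIANGLES, len(triangles), GL_UNSIGNED_INT, None)
        glDrawElements(GL_POINTS,    len(triangles), GL_UNSIGNED_INT, None)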

    And keep in mind that the vertices array contains floats while glDrawArrays draws vertices, so you have to draw len(vertices)/3 vertices. Just remember: an element is an index (of a single vertex), not a triangle, and a vertex is 3 floats (or whatever you specified in glVertexPointer), not just one.
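    With a flat float array, the glDrawArrays count would therefore look like this (a sketch with made-up data; client-side arrays are used here just to keep it short):

        import numpy
        from OpenGL.GL import *

        # Two vertices, three floats each -> a flat array of six floats.
        vertices = numpy.array([0.0, 0.0, 0.0,
                                1.0, 0.0, 0.0], dtype=numpy.float32)

        glEnableClientState(GL_VERTEX_ARRAY)
        glVertexPointer(3, GL_FLOAT, 0, vertices)   # 3 floats per vertex
        # 6 floats / 3 floats per vertex = 2 vertices to draw.
        glDrawArrays(GL_POINTS, 0, len(vertices) // 3)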

    But if your triangles array really contains tuples of 3 indices (so that len(triangles) is the triangle count, not the index count), you would have to draw 3*len(triangles) elements (indices). And if your vertices array contains vectors rather than plain floats, then drawing len(vertices) vertices in the glDrawArrays call is correct. It would therefore be nice to see their declarations to be sure.
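    The two layouts give different counts; a sketch with made-up data, just to show the arithmetic:

        import numpy
        from OpenGL.GL import *

        # Flat layout: one entry per index, so the count is len() directly.
        flat = numpy.array([0, 1, 2, 2, 1, 3], dtype=numpy.uint32)
        count = len(flat)           # 6 indices

        # Nested (N x 3) layout: one row per triangle, so multiply by 3.
        nested = numpy.array([[0, 1, 2], [2, 1, 3]], dtype=numpy.uint32)
        count = 3 * len(nested)     # also 6 (equivalently nested.size)

        # Passing the index array directly (client-side indices) here; with a
        # bound element buffer you would pass an offset instead.
        glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_INT, nested)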
