Ok so I'm still struggling to get this to work. The important parts of my code are:
def __init__(self, vertices, normals, triangles):
    self.bufferVertice
I have no experience with Python GL, but I think I spotted something. You use len(self.triangles) in the call to glDrawElements, so I suppose that gives you the number of indices in the triangles array. But why then use len(triangles) as the size in glBufferData, and not ADT.arrayByteCount like in the other calls? Your buffer is simply too small: it contains len(triangles) bytes, although triangles actually contains unsigned ints. If triangles really contained bytes (which I doubt), you would have to use GL_UNSIGNED_BYTE in glDrawElements.
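For illustration, the upload could look roughly like this. This is only a sketch: the variable names, the uint32 dtype and the GL_STATIC_DRAW usage are assumptions on my part (not taken from your code), and a current GL context is required.

from OpenGL.GL import *
from OpenGL.arrays import ArrayDatatype as ADT
import numpy

# hypothetical index data; your real triangles array goes here
triangles = numpy.array([0, 1, 2, 2, 3, 0], dtype=numpy.uint32)

bufferTriangles = glGenBuffers(1)
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, bufferTriangles)
glBufferData(GL_ELEMENT_ARRAY_BUFFER,
             ADT.arrayByteCount(triangles),   # size in bytes, not len(triangles)
             ADT.voidDataPointer(triangles),
             GL_STATIC_DRAW)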
EDIT: Going by your edits, I have some more answers. Of course glDrawElements(GL_POINTS, ...) needs indices, too. It just uses every index to draw a point, instead of every three indices to draw a triangle. It's just that for points you often don't need glDrawElements, as you don't reuse vertices anyway, but you still need indices for it. It doesn't magically become a glDrawArrays call under the hood.
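A tiny sketch of what I mean, using client-side arrays and made-up data so the indices are visible (a current GL context is assumed):

from OpenGL.GL import *
import numpy

vertices = numpy.array([0.0, 0.0, 0.0,
                        1.0, 0.0, 0.0,
                        0.0, 1.0, 0.0], dtype=numpy.float32)
indices = numpy.array([0, 1, 2], dtype=numpy.uint32)

glEnableClientState(GL_VERTEX_ARRAY)
glVertexPointer(3, GL_FLOAT, 0, vertices)
# every three indices make one triangle ...
glDrawElements(GL_TRIANGLES, len(indices), GL_UNSIGNED_INT, indices)
# ... but with GL_POINTS every single index draws one point
glDrawElements(GL_POINTS, len(indices), GL_UNSIGNED_INT, indices)
glDisableClientState(GL_VERTEX_ARRAY)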
And keep in mind that the vertices array contains floats, whereas glDrawArrays draws vertices, so you have to draw len(vertices)/3 vertices. Just remember: an element is an index (of a single vertex), not a triangle, and a vertex is 3 floats (or whatever you specified in glVertexPointer), not just one.
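For example (again made-up data, assuming vertices is a flat float32 array of x, y, z triples):

from OpenGL.GL import *
import numpy

vertices = numpy.array([0.0, 0.0, 0.0,
                        1.0, 0.0, 0.0,
                        0.0, 1.0, 0.0], dtype=numpy.float32)

glEnableClientState(GL_VERTEX_ARRAY)
glVertexPointer(3, GL_FLOAT, 0, vertices)        # a vertex is 3 floats
glDrawArrays(GL_POINTS, 0, len(vertices) // 3)   # 3 points, not 9
glDisableClientState(GL_VERTEX_ARRAY)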
But if your triangles array really contains tuples of 3 indices (and therefore len(triangles) is the triangle count and not the index count), you would have to draw 3*len(triangles) elements (indices). And if your vertices array contains vectors and not just floats, then drawing len(vertices) vertices in the glDrawArrays call is correct. It would therefore be nice to see their declarations to be sure.
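In other words, the count you pass depends on the shape of the array (made-up data, just to show the difference):

import numpy

# flat index array: len() already is the number of elements (indices)
triangles_flat = numpy.array([0, 1, 2, 2, 3, 0], dtype=numpy.uint32)
index_count = len(triangles_flat)              # 6

# array of index triples: len() is the triangle count, so multiply by 3
triangles_tuples = numpy.array([[0, 1, 2],
                                [2, 3, 0]], dtype=numpy.uint32)
index_count = 3 * len(triangles_tuples)        # also 6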
In my experience, the Python OpenGL wrapper is very buggy once you start using some of the more advanced OpenGL calls. Many calls seem to cause a crash for no reason, and sometimes they work if you replace them with an equivalent sequence of different calls... I switched to a different programming language for OpenGL instead of having to deal with these issues.
The reason why
glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, ADT.voidDataPointer(self.triangles))
works and
glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, 0)
doesn't is that PyOpenGL expects None as the void pointer, rather than 0. Be careful when using OpenGL examples written in C: they pass (void*)0 as the void pointer, which PyOpenGL does not interpret as a pointer; it treats 0 as a non-void value.
Instead, you should use
glDrawElements(GL_TRIANGLES, len(self.triangles), GL_UNSIGNED_SHORT, None)
(See also https://stackoverflow.com/a/14365737/478380)
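Put together, a draw with the element buffer bound could look roughly like this. It's only a sketch: the buffer and array names are made up, GL_UNSIGNED_SHORT assumes the indices were uploaded as 16-bit values (use GL_UNSIGNED_INT for uint32), and a current GL context with an enabled vertex array is assumed.

from OpenGL.GL import *
from OpenGL.arrays import ArrayDatatype as ADT
import numpy

triangles = numpy.array([0, 1, 2, 2, 3, 0], dtype=numpy.uint16)  # hypothetical data

bufferTriangles = glGenBuffers(1)
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, bufferTriangles)
glBufferData(GL_ELEMENT_ARRAY_BUFFER,
             ADT.arrayByteCount(triangles),
             ADT.voidDataPointer(triangles),
             GL_STATIC_DRAW)

# with the buffer bound, the last argument is an offset into the buffer,
# and PyOpenGL wants None here instead of the 0 that C examples use
glDrawElements(GL_TRIANGLES, len(triangles), GL_UNSIGNED_SHORT, None)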