I have recently migrated from a 32-bit environment to a 64-bit one, and it has gone smoothly apart from one problem: glMultiDrawElements uses some arrays that do not work without some tweaking under a 64-bit OS.
glMultiDrawElements( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                     reinterpret_cast< const GLvoid** >( iOffset_ ),
                     mesh().faces().size() );
I am using VBOs for both the vertices and vertex indices. fCount_ and iOffset_ are arrays of GLsizei. Since a buffer is bound to GL_ELEMENT_ARRAY_BUFFER, iOffset_'s elements are used as byte offsets from the beginning of the VBO. This works perfectly under a 32-bit OS.
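For context, here is a minimal sketch of how such arrays might be populated, assuming one entry per face, 32-bit indices packed face after face in the index VBO, and pre-sized fCount_/iOffset_ arrays; the names f and runningByteOffset are illustrative and not from the original code:

GLsizei runningByteOffset = 0;
int f = 0;
for ( Sy_meshData::Faces::ConstIterator i = mesh().faces().constBegin();
      i != mesh().faces().constEnd(); ++i, ++f ) {
    fCount_[f]  = i->vertexIndices.size();   // number of indices in this face
    iOffset_[f] = runningByteOffset;         // byte offset of this face's first index
    runningByteOffset += i->vertexIndices.size() * sizeof( GLuint );
}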
If I change glMultiDrawElements to glDrawElements and put it into a loop, it works fine on both platforms:
int offset = 0;
for ( Sy_meshData::Faces::ConstIterator i = mesh().faces().constBegin();
      i != mesh().faces().constEnd(); ++i ) {
    glDrawElements( GL_LINE_LOOP, i->vertexIndices.size(), GL_UNSIGNED_INT,
                    reinterpret_cast< const GLvoid* >( sizeof( GLsizei ) * offset ) );
    offset += i->vertexIndices.size();
}
I think what I am seeing is OpenGL reading 64-bit chunks of iOffset_, leading to massive numbers, but glMultiDrawElements does not support any type wider than 32 bits (GL_UNSIGNED_INT), so I'm not sure how to correct it.
Has anyone else had this situation and solved it? Or am I handling this entirely wrong and was just lucky on a 32bit OS?
Update
Swapping out my existing code for:
typedef void ( *testPtr )( GLenum mode, const GLsizei* count, GLenum type,
                           const GLuint* indices, GLsizei primcount );
testPtr ptr = (testPtr)glMultiDrawElements;
ptr( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT, iOffset_, mesh().faces().size() );
gives exactly the same result.
The simple reason is that glMultiDrawElements doesn't expect an array of integer offsets (32-bit on your platform) but an array of pointers (64-bit on your platform), interpreted as buffer offsets.
You, however, are just casting the array of (or pointer to) integers to an array of (or pointer to) pointers, which won't work: the function now reinterprets your n consecutive 32-bit values as n consecutive 64-bit values. It works for glDrawElements, of course, because there you cast a single integer to a single pointer, which essentially converts your 32-bit value into a 64-bit value.
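As a standalone illustration of that reinterpretation (the offset values are made up, and a 64-bit little-endian build is assumed):

#include <cstdint>
#include <cstdio>
#include <cstring>

int main()
{
    std::uint32_t offsets[4] = { 0, 12, 24, 36 };   // 32-bit byte offsets

    // What the bad cast makes the callee do: read one pointer-sized (64-bit) element.
    std::uint64_t firstPointerValue = 0;
    std::memcpy( &firstPointerValue, offsets, sizeof firstPointerValue );

    // Prints 0xc00000000: offsets[0] and offsets[1] fused into one bogus "pointer".
    std::printf( "0x%llx\n", static_cast< unsigned long long >( firstPointerValue ) );
    return 0;
}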
What you need to do is cast not the pointer/array itself but each individual value in this offset array:
// Convert each 32-bit byte offset into a pointer-sized value
// (needs <vector> and <cstdint>).
std::vector< void* > pointers( mesh().faces().size() );
for ( size_t i = 0; i < pointers.size(); ++i )
    pointers[i] = reinterpret_cast< void* >( static_cast< std::uintptr_t >( iOffset_[i] ) );

glMultiDrawElements( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                     &pointers.front(), mesh().faces().size() );
Or better, just store your offsets as pointers instead of integers right from the start.
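One possible sketch of that, assuming the index VBO stores each face's 32-bit indices back to back; the member name iOffsetPtrs_ is made up for illustration:

std::vector< const GLvoid* > iOffsetPtrs_;   // hypothetical member, needs <vector>

// While filling the index VBO, record each face's byte offset as a pointer:
std::size_t byteOffset = 0;
for ( Sy_meshData::Faces::ConstIterator i = mesh().faces().constBegin();
      i != mesh().faces().constEnd(); ++i ) {
    iOffsetPtrs_.push_back( reinterpret_cast< const GLvoid* >( byteOffset ) );
    byteOffset += i->vertexIndices.size() * sizeof( GLuint );
}

// At draw time no per-call conversion is needed:
glMultiDrawElements( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                     &iOffsetPtrs_.front(), mesh().faces().size() );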
You're running into the problems I thoroughly dissected in https://stackoverflow.com/a/8284829/524368.
I suggest you follow the suggestion at the very end of that answer: don't try to cast your numbers into something the compiler thinks is a pointer; instead cast the function signature into something that takes a number.
Note that in the case of glMultiDrawElements the first indirection does not go into a VBO but into client memory. So the signature to cast to is, e.g.:
void myglMultiDrawElementsOffset( GLenum mode,
                                  const GLsizei* count,
                                  GLenum type,
                                  const uintptr_t* indices,
                                  GLsizei primcount );
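A sketch of that approach, tied back to the question's fCount_/iOffset_; the typedef name and the offsets vector are illustrative, and the crucial difference from the GLuint attempt in the update is that uintptr_t is pointer-sized:

#include <stdint.h>   // uintptr_t
#include <vector>

typedef void ( *MultiDrawElementsOffsetPtr )( GLenum mode, const GLsizei* count,
                                              GLenum type, const uintptr_t* indices,
                                              GLsizei primcount );

// Widen the 32-bit byte offsets to pointer-sized integers once.
std::vector< uintptr_t > offsets( mesh().faces().size() );
for ( size_t i = 0; i < offsets.size(); ++i )
    offsets[i] = static_cast< uintptr_t >( iOffset_[i] );

MultiDrawElementsOffsetPtr myglMultiDrawElementsOffset =
    reinterpret_cast< MultiDrawElementsOffsetPtr >( glMultiDrawElements );
myglMultiDrawElementsOffset( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                             &offsets.front(), mesh().faces().size() );

Calling through a differently-typed function pointer is formally undefined behaviour, but on platforms where uintptr_t has the same size and representation as a data pointer it feeds the driver exactly the array layout it expects.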
Source: https://stackoverflow.com/questions/8719287/using-glmultidrawelements-in-64bit-os