Using glMultiDrawElements on a 64-bit OS


The simple reason is that glMultiDrawElements doesn't expect an array of integer offsets (32-bit on your platform), but an array of pointers (64-bit on your platform), which are interpreted as buffer offsets.

But you are just casting the array of (or pointer to) integers to an array of (or pointer to) pointers, which won't work, because the function then reinterprets your n consecutive 32-bit values as n consecutive 64-bit values. Of course it works for glDrawElements, because there you cast a single integer into a single pointer, which simply widens your 32-bit value to a 64-bit value.

What you need to do is not cast your pointer/array, but each individual value in this offset array:

std::vector<void*> pointers(mesh().faces().size());
for(size_t i = 0; i < pointers.size(); ++i)
    // widen the 32-bit offset first, then reinterpret it as a "pointer"
    pointers[i] = reinterpret_cast<void*>(static_cast<uintptr_t>(iOffset_[i]));
glMultiDrawElements( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                     &pointers.front(), static_cast<GLsizei>(mesh().faces().size()) );

Or better, just store your offsets as pointers instead of integers right from the start.
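A minimal sketch of what that could look like, assuming the index buffer holds GLuints and the per-face index count comes from a hypothetical helper indexCountOfFace(i) (a placeholder for however you compute it):

// Hypothetical sketch: store byte offsets as void* from the start,
// so no per-draw conversion loop is needed later.
std::vector<GLsizei> counts(mesh().faces().size());
std::vector<void*>   offsets(mesh().faces().size());
uintptr_t running = 0;
for(size_t i = 0; i < offsets.size(); ++i)
{
    counts[i]  = indexCountOfFace(i);                                // placeholder: indices per face
    offsets[i] = reinterpret_cast<void*>(running * sizeof(GLuint));  // byte offset into the index buffer
    running   += counts[i];
}
glMultiDrawElements( GL_LINE_LOOP, &counts.front(), GL_UNSIGNED_INT,
                     &offsets.front(), static_cast<GLsizei>(offsets.size()) );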

datenwolf

You're running into the problems I thoroughly dissected in https://stackoverflow.com/a/8284829/524368

I suggest you follow my suggestion at the very end of the answer, and don't try to cast your numbers into something the compiler thinks is a pointer, but cast the function signature into something that takes a number.

Note that in the case of glMultiDrawElements the first indirection does not go into a VBO, but into client memory. So the signature to cast to is, e.g.:

void myglMultiDrawElementsOffset(GLenum mode,
    const GLsizei * count,
    GLenum type,
    const uintptr_t * indices,
    GLsizei  primcount);
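
A minimal sketch of how that trick might be used (assuming the byte offsets are already stored in an array of uintptr_t named iOffset_; the typedef and variable names here are made up for illustration):

typedef void (*MultiDrawElementsOffsetFn)(GLenum mode,
                                          const GLsizei *count,
                                          GLenum type,
                                          const uintptr_t *indices,
                                          GLsizei primcount);

// Reinterpret the GL entry point so the compiler accepts plain integers.
// Calling through a function pointer of a different type is technically
// undefined behaviour, but on common ABIs a uintptr_t is passed exactly
// like a pointer, which is what this trick relies on.
MultiDrawElementsOffsetFn myglMultiDrawElementsOffset =
    reinterpret_cast<MultiDrawElementsOffsetFn>(glMultiDrawElements);

myglMultiDrawElementsOffset(GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                            iOffset_, static_cast<GLsizei>(mesh().faces().size()));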