Do I have to use the OpenGL data types (GLint, GLchar, …) for a cross-platform game?

悲&欢浪女 2020-12-14 16:06

I have a short question. Why does OpenGL come with its own datatypes for standard types like int, unsigned int, char, and so on? And do I have to use them instead of the built-in datatypes?

2 Answers
  • 2020-12-14 16:51

    I'm not an OpenGL expert, but frameworks/platforms such as OpenGL, Qt, etc. usually define their own datatypes so that the meaning and capacity of the underlying datatype remain the same across different OSes. This is often done with C/C++ preprocessor macros, but as for GLuint, it seems to be just a typedef in gl.h:

    typedef unsigned int GLuint;
    

    So the answer is yes: you should use the framework's datatypes to ensure good portability of your code within that framework across OSes.
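
    A minimal sketch (my addition, not from the answer, assuming a C++11 compiler): since the typedefs may differ per platform, you can verify at compile time that they come out to the sizes the OpenGL spec requires, so a mismatch breaks the build rather than corrupting data at runtime.

    #include <GL/gl.h>  // on Windows this header typically needs <windows.h> included first

    // These widths are mandated by the OpenGL spec regardless of platform.
    static_assert(sizeof(GLuint)  == 4, "OpenGL requires GLuint to be exactly 32 bits");
    static_assert(sizeof(GLint)   == 4, "OpenGL requires GLint to be exactly 32 bits");
    static_assert(sizeof(GLshort) == 2, "OpenGL requires GLshort to be exactly 16 bits");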

  • 2020-12-14 16:58

    "For example, the OpenGL equivalent to unsigned int is GLuint"

    No it isn't, and that's exactly why you should use OpenGL's data types when interfacing with OpenGL.

    GLuint is not "equivalent" to unsigned int. GLuint is required to be 32 bits in size; it is always 32 bits. unsigned int might be 32 bits in size, or it might be 64 bits. You don't know, and C isn't going to tell you (outside of sizeof).

    These datatypes will be defined for each platform, and they may be defined differently for different platforms. You use them because, even if they are defined differently, they will always come out to the same sizes: the sizes that the OpenGL API expects and requires.
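
    A minimal sketch (my addition, assuming a GL context and loader are already set up) of why the exact width matters in practice: an index buffer that OpenGL reads as GL_UNSIGNED_INT must hold elements that are exactly 32 bits wide, which GLuint guarantees and unsigned int does not.

    #include <vector>

    // Index data OpenGL will interpret as GL_UNSIGNED_INT, i.e. exactly 32 bits
    // per element. std::vector<unsigned int> would compile everywhere, but on a
    // platform where unsigned int is 64 bits the byte layout would no longer
    // match what the driver expects; GLuint removes that doubt.
    std::vector<GLuint> indices = {0, 1, 2, 2, 3, 0};

    GLuint ebo;
    glGenBuffers(1, &ebo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indices.size() * sizeof(GLuint),  // byte count matches the GL-side type
                 indices.data(), GL_STATIC_DRAW);
    glDrawElements(GL_TRIANGLES, (GLsizei)indices.size(), GL_UNSIGNED_INT, nullptr);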
