lodePNG on Windows: OpenGL texture crash

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-11 17:51:20

Question


I'm using lodePNG to load texture from a png file but the program crashes when I load the texture.

const char* filename = "texture.png";
unsigned width, height;
std::vector<unsigned char>image;
GLuint texture[1];
//decode
unsigned error = lodepng::decode(image, width, height, filename);
if(error) std::cerr << "decoder error " << error << ": " << lodepng_error_text(error) << std::endl;
glBindTexture(GL_TEXTURE_2D, texture[0]);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, &image[0]);

I compiled with VS 2010 and there are no errors, but the program crashes when glTexImage2D is called. My system is Windows 7, and the graphics card supports OpenGL 3.3.

I keep porting the same code to another system, a Mac running OS X 10.6, and it works there without any problems. Is there a fix I could make so it loads the texture correctly on Windows?

Here is the site where I got the lodePNG files: http://lodev.org/lodepng/


Answer 1:


In your code, you declare GLuint texture[1], an array of one texture handle. You then bind to this uninitialized texture handle on this line:

glBindTexture(GL_TEXTURE_2D, texture[0]);

This is incorrect: you are binding to a texture handle that has never been generated. You first need to create the handle by calling glGenTextures; only then may you bind to it. So try this:

Add this:

glGenTextures(1, &texture[0]);

Before this line:

glBindTexture(GL_TEXTURE_2D, texture[0]);

After calling glGenTextures, your handle (texture[0]) should be a non-zero value.
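Putting it together, here is a minimal sketch of the corrected load path, assuming a valid OpenGL context already exists and that "texture.png" sits next to the executable. The filtering calls are an addition of mine, not part of the original code: they are a common follow-up, since a texture without mipmaps and with the default min filter may sample as black even after the crash is fixed.

#include <iostream>
#include <vector>
#include <GL/gl.h>      // or whichever GL loader header the project already uses
#include "lodepng.h"

void loadTexture()
{
    const char* filename = "texture.png";
    unsigned width = 0, height = 0;
    std::vector<unsigned char> image;

    unsigned error = lodepng::decode(image, width, height, filename);
    if (error) {
        std::cerr << "decoder error " << error << ": "
                  << lodepng_error_text(error) << std::endl;
        return;                        // don't touch GL with an empty image
    }

    GLuint texture[1];
    glGenTextures(1, &texture[0]);     // create the handle first
    glBindTexture(GL_TEXTURE_2D, texture[0]);

    // Assumed filtering state: linear filtering so the texture samples
    // correctly without mipmaps.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, &image[0]);
}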



Source: https://stackoverflow.com/questions/13827162/lodepng-on-windows-of-ogl-texture-crash
