Question
Using C++, I'm trying to load a texture into OpenGL using DevIL. After scrounging around for different code segments, I've put together the code below, but it doesn't seem to work completely.
Loading a texture (Part of a Texture2D class):
void Texture2D::LoadTexture(const char *file_name)
{
    unsigned int image_ID;

    ilInit();
    iluInit();
    ilutInit();
    ilutRenderer(ILUT_OPENGL);

    image_ID = ilutGLLoadImage((char*)file_name);

    sheet.texture_ID = image_ID;
    sheet.width = ilGetInteger(IL_IMAGE_WIDTH);
    sheet.height = ilGetInteger(IL_IMAGE_HEIGHT);
}
This compiles and works fine. I do realise that I should only call ilInit(), iluInit(), and ilutInit() once, but if I remove those lines the program instantly breaks upon loading any image (it compiles fine, but errors at runtime).
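One simple way to guarantee the one-time initialisation (a minimal sketch; the guard could equally live in the application's startup code, and the helper name InitDevILOnce is made up for illustration):

#include <IL/il.h>
#include <IL/ilu.h>
#include <IL/ilut.h>

static void InitDevILOnce()
{
    // function-local static: the body runs only on the first call
    static bool initialised = false;
    if (!initialised)
    {
        ilInit();
        iluInit();
        ilutInit();
        ilutRenderer(ILUT_OPENGL);
        initialised = true;
    }
}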
Displaying the texture in OpenGL (Part of the same class):
void Texture2D::Draw()
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glPushMatrix();

    u = v = 0;

    // this is the origin point (the position of the button)
    VectorXYZ point_TL; // Top Left
    VectorXYZ point_BL; // Bottom Left
    VectorXYZ point_BR; // Bottom Right
    VectorXYZ point_TR; // Top Right

    /* For the sake of simplicity, I've removed the code calculating the
       4 points of the quad. Assume that they are found correctly. */

    glColor3f(1, 1, 1);

    // bind the appropriate texture frame
    glBindTexture(GL_TEXTURE_2D, sheet.texture_ID);

    // draw the image as a quad the size of the first loaded image
    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0);
        glVertex3f(point_TL.x, point_TL.y, point_TL.z); // Top Left
        glTexCoord2f(0, 1);
        glVertex3f(point_BL.x, point_BL.y, point_BL.z); // Bottom Left
        glTexCoord2f(1, 1);
        glVertex3f(point_BR.x, point_BR.y, point_BR.z); // Bottom Right
        glTexCoord2f(1, 0);
        glVertex3f(point_TR.x, point_TR.y, point_TR.z); // Top Right
    glEnd();

    glPopMatrix();

    glDisable(GL_BLEND);
    glDisable(GL_TEXTURE_2D);
}
Currently, the quad shows up, but it's completely white (the background colour it was given). The image I'm loading exists and loads fine (verified using the loaded size values).
Another few things I should note:
1) I am using a depth buffer. I've heard this doesn't go well with GL_BLEND?
2) I would really like to keep using the ilutGLLoadImage function.
3) I'd appreciate example code, as I'm a newbie to OpenGL and DevIL as a whole.
Answer 1:
Yes, there is a known problem here: ilutGLLoadImage() can be unreliable.
Try doing things manually instead (a sketch follows the link below):
- Load the image with ilLoadImage()
- Generate the OpenGL texture handle with glGenTextures()
- Upload the image data to OpenGL with glTexImage2D() and ilGetData()
See this link for a working solution:
http://r3dux.org/tag/ilutglloadimage/
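A minimal sketch of that manual path (the function name LoadTextureManually is mine; it assumes DevIL is already initialised and a GL context is current, and it reduces error handling to a single failure return):

#include <IL/il.h>
#include <GL/gl.h>

GLuint LoadTextureManually(const char *file_name)
{
    ILuint image_ID;
    ilGenImages(1, &image_ID);              // DevIL-side image handle
    ilBindImage(image_ID);

    if (!ilLoadImage(file_name))            // 1) load the file with DevIL
    {
        ilDeleteImages(1, &image_ID);
        return 0;
    }

    // Convert to a layout OpenGL understands directly.
    ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE);

    GLuint texture_ID;
    glGenTextures(1, &texture_ID);          // 2) generate the GL texture handle
    glBindTexture(GL_TEXTURE_2D, texture_ID);

    // Without explicit filters the default minification filter expects
    // mipmaps, and a mipmap-less texture samples as plain white.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // 3) upload DevIL's pixel data into the bound GL texture
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 ilGetInteger(IL_IMAGE_WIDTH),
                 ilGetInteger(IL_IMAGE_HEIGHT),
                 0, GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());

    ilDeleteImages(1, &image_ID);           // GL now owns its own copy
    return texture_ID;
}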
I know, this solution seems "a little" complicated, but nobody knows how much time you would spend fighting a bug hidden deep inside DevIL.
Another way of fixing things: check your GL texture setup code. Anything wrong in the filtering setup can be a reason for GL_INVALID_OPERATION.
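In particular, the default minification filter expects mipmaps, and a texture without them is "incomplete", so fixed-function GL draws it as if texturing were off, i.e. you see plain glColor white. A hedged snippet of the usual fix (reusing the sheet.texture_ID field from the question's code):

#include <GL/gl.h>

void FixFiltering(GLuint texture_ID)
{
    glBindTexture(GL_TEXTURE_2D, texture_ID);
    // non-mipmap filters make the texture complete without a mipmap chain
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}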
We've run into the "white texture" issue many times while programming on old ATI cards.
Oh! The biggest guess: non-power-of-two textures. Is your texture file 2^N by 2^N, or something different?
To use non-power-of-two textures you have to rely on GL extensions (such as GL_ARB_texture_non_power_of_two).
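A quick way to check whether a loaded image hits that limitation, using the sizes the question already reads back from DevIL (IsPowerOfTwo is a hypothetical helper, not a DevIL call):

#include <IL/il.h>

static bool IsPowerOfTwo(unsigned int n)
{
    return n != 0 && (n & (n - 1)) == 0;   // exactly one bit set
}

// after ilLoadImage() succeeds:
bool gl_safe = IsPowerOfTwo(ilGetInteger(IL_IMAGE_WIDTH)) &&
               IsPowerOfTwo(ilGetInteger(IL_IMAGE_HEIGHT));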
And the other one: are you using the textures in the same thread or in another? Remember that you should call glGenTextures() and glBindTexture()/glBegin()/glEnd() from the same thread, i.e. the one that owns the GL context.
Source: https://stackoverflow.com/questions/10707230/loading-textures-in-c-opengl-using-devil