I'm trying to figure out texture mapping in OpenGL and I can't get a simple example to work.
The polygon is being drawn, but it's not textured, just a solid color.
I found the problem. My call was glEnable(GL_BLEND | GL_TEXTURE_2D). Using glGetError I saw this call was returning GL_INVALID_ENUM, so I moved GL_TEXTURE_2D into its own glEnable call and bingo. It turns out glEnable capabilities can't be combined with a logical OR; each one needs its own call.
First thing I'd check is the colour material setting, as mentioned by ShadowIce, then check your texture file to ensure it's a reasonable size (e.g. 256x256) and an RGB bitmap. If the file has even a slight problem it WILL NOT render correctly, no matter how you try.
Then, I'd stop trying to debug that code in isolation and instead compare it against the tutorial on the NeHe website to see what's different.
NeHe is always a good place to check if you're trying to do stuff in OpenGL. Textures are probably the hardest thing to get right, and they only get more difficult as the rest of your GL skills increase.
In your comments, you say your bitmap is 29x20 pixels. AFAIK, to generate a valid texture, OpenGL requires each image dimension to be a power of 2. It doesn't need to be square; a rectangle like 64x32 is fine. You can get around this restriction with extensions like GL_ARB_texture_rectangle.
My OpenGL is rusty, but I remember having the same problems with glTexImage2D. I eventually got it working, but I always had more luck with gluBuild2DMipmaps, so I ended up with
gluBuild2DMipmaps (
GL_TEXTURE_2D, type, i.width, i.height, type, GL_UNSIGNED_BYTE, i.data
);
which replaced
glTexImage2D (
GL_TEXTURE_2D, 0, type, i.width, i.height, 0, type, GL_UNSIGNED_BYTE, i.data
);
Some random ideas:
For error checking, I recommend writing a small function that prints readable output for the most common glGetError results, and using it to find the line that triggers the error. Another possibility would be to use a tool like GLIntercept, BuGLe or gDEBugger.
I'll put this here as I had the same issue and found another post explaining it. The iPhone does support GL_BGRA (via the GL_EXT_BGRA extension), but seemingly only as an input format, not as an internal format. So if you change the glTexImage2D call to use an internal format of GL_RGBA, it works:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, &sprite1[54]);
I hope this helps someone else who stumbles upon this post.