What I'm trying to do is load a texture into hardware from a single-channel data array and use its alpha channel to draw text onto an object. I am using OpenGL 4.
If I do this using a 4-channel RGBA texture it works perfectly fine, but for whatever reason, when I try to load in a single channel only, I get a garbled image and I can't figure out why. I create the texture by combining the bitmap data for a series of glyphs into a single texture with the following code:
```cpp
int texture_height = max_height * new_font->num_glyphs;
int texture_width = max_width;
new_texture->datasize = texture_width * texture_height;
unsigned char* full_texture = new unsigned char[new_texture->datasize];

// prefill texture as transparent
for (unsigned int j = 0; j < new_texture->datasize; j++)
    full_texture[j] = 0;

for (unsigned int i = 0; i < glyph_textures.size(); i++) {
    // set height offset for glyph
    new_font->glyphs[i].height_offset = max_height * i;
    for (unsigned int j = 0; j < new_font->glyphs[i].height; j++) {
        int full_disp = (new_font->glyphs[i].height_offset + j) * texture_width;
        int bit_disp = j * new_font->glyphs[i].width;
        for (unsigned int k = 0; k < new_font->glyphs[i].width; k++) {
            full_texture[full_disp + k] = glyph_textures[i][bit_disp + k];
        }
    }
}
```
Then I load the texture data by calling:
```cpp
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->x, texture->y, 0,
             GL_RED, GL_UNSIGNED_BYTE, reinterpret_cast<void*>(full_texture));
```
My fragment shader executes the following code:
```glsl
#version 330

uniform sampler2D texture;

in vec2 texcoord;
in vec4 pass_colour;
out vec4 out_colour;

void main() {
    float temp = texture2D(texture, texcoord).r;
    out_colour = vec4(pass_colour[0], pass_colour[1], pass_colour[2], temp);
}
```
I get an image that I can tell is generated from the texture, but it is terribly distorted and I'm unsure why. Btw, I'm using GL_RED because GL_ALPHA was removed from OpenGL 4. What really confuses me is why this works fine when I generate a 4-channel RGBA texture from the glyphs and then use its alpha channel?