Question
I'm trying to save a screen region from opengl into a bitmap. I've tried using FreeImage, and SDL_Image, and they both require me to swap red and blue channels. Of course, that brings me to suspect that glReadPixels is the problem here... I have this sample code:
bool CaptureScreenRegionToFile ( uint32_t In_X, uint32_t In_Y, uint32_t In_Width, uint32_t In_Height, std::string In_Filename )
{
    GLubyte* ImageData = ( GLubyte* ) malloc ( In_Width * In_Height * 3 );
    glPixelStorei ( GL_PACK_ALIGNMENT, 1 );
    glReadPixels ( In_X, In_Y, In_Width, In_Height, GL_RGB, GL_UNSIGNED_BYTE, ImageData );
    if ( CheckError() == false )
    {
        free ( ImageData );
        return false;
    }

    SDL_Surface *Surface;
    // JTP TODO Known bug here. Red and blue are swapped, for some reason...
    Surface = SDL_CreateRGBSurfaceFrom ( ImageData, In_Width, In_Height, 3 * 8, In_Width * 3, 0x00FF0000, 0x0000FF00, 0x000000FF, 0 );
    SDL_SaveBMP ( Surface, In_Filename.c_str() );
    SDL_FreeSurface ( Surface );
    free ( ImageData );
    return true;
}
Unless I swap the red and blue channels manually after the call to SDL_CreateRGBSurfaceFrom, the colors come out swapped in the BMP. Is glReadPixels supposed to do this? Am I calling it correctly? What's wrong here?
Answer 1:
glReadPixels works exactly as intended. When you read back the data as GL_RGB, you get the following layout in memory:
 0 1 2 3 4 5 6 7 8    byte number
+-+-+-+-+-+-+-+-+-+---+
|R|G|B|R|G|B|R|G|B|...|
+-+-+-+-+-+-+-+-+-+---+
Now you use SDL_CreateRGBSurfaceFrom with 24 bits per pixel. This will interpret every group of 3 bytes as one pixel. However, it treats that group as one big integer number, with the rmask, gmask, bmask and amask parameters describing which bits belong to which image channel.
For example, you use 0x00FF0000 as the red mask, meaning bits 16 to 23 are supposed to contain red (counting from the least significant bit, starting at zero):
 31      24 23      16 15       8 7        0   bit number
+--------+--------+--------+--------+
|00000000|11111111|00000000|00000000|
+--------+--------+--------+--------+
  byte 3    byte 2    byte 1    byte 0
However, you are most likely on a little-endian machine, where the least significant byte is stored at the lowest memory address. That means your rmask of 0x00FF0000 is laid out in memory as:
  0    1    2    byte number
+----+----+----+
| 00 | 00 | FF |
+----+----+----+
  R    G    B    channel glReadPixels put there
so it doesn't match the format you read the pixel data in. Also note that your code is not portable: you would get a different (the intended) result on a big-endian architecture.
UPDATE 1
If you want to write this code as portably as possible, without adding endian-dependent #ifdefs, you can't use a format with 3 bytes per pixel. However, you can use GL_UNSIGNED_INT_8_8_8_8 or GL_UNSIGNED_INT_8_8_8_8_REV, in which each pixel is treated as a 32-bit integer and stored using the machine's native endianness. The _REV variant stores the first component of your format (e.g. red in GL_RGBA) in the least significant byte, the non-_REV variant in the most significant byte. In either case, you can set the appropriate masks once, and it will work independently of the endianness.
Source: https://stackoverflow.com/questions/38038075/glreadpixels-swaps-blue-and-red