Question
CoreGraphics on iOS is very easy to use, but is it possible to take the output of CoreGraphics and put it into OpenGL textures?
The final goal is to use CGContextDrawPDFPage to render PDFs with high performance and write the output into a specific texture ID with OpenGL:
glBindTexture(GL_TEXTURE_2D, TextureNativeId);
It does look like CoreGraphics is not able to render directly into a specific "native texture id".
Answer 1:
Yes, you can, by rendering your Core Graphics content to a bitmap context and uploading that to a texture. The following is code that I use to draw a UIImage to a Core Graphics context, but you could replace the CGContextDrawImage() portion with your own drawing code:
// pixelSizeOfImage is a CGSize and newImageSource is a UIImage, both defined elsewhere.
GLubyte *imageData = (GLubyte *) calloc(1, (int)pixelSizeOfImage.width * (int)pixelSizeOfImage.height * 4);
CGColorSpaceRef genericRGBColorspace = CGColorSpaceCreateDeviceRGB();
CGContextRef imageContext = CGBitmapContextCreate(imageData, (int)pixelSizeOfImage.width, (int)pixelSizeOfImage.height, 8, (int)pixelSizeOfImage.width * 4, genericRGBColorspace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

// Draw into the bitmap context; this is the part to replace with your own drawing code.
CGContextDrawImage(imageContext, CGRectMake(0.0, 0.0, pixelSizeOfImage.width, pixelSizeOfImage.height), [newImageSource CGImage]);
CGContextRelease(imageContext);
CGColorSpaceRelease(genericRGBColorspace);

// Upload the rendered bytes into the previously created texture.
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)pixelSizeOfImage.width, (int)pixelSizeOfImage.height, 0, GL_BGRA, GL_UNSIGNED_BYTE, imageData);
free(imageData); // Release the CPU-side copy once it has been uploaded.
This assumes that you've created your texture using code like the following:
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &outputTexture);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// This is necessary for non-power-of-two textures
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glBindTexture(GL_TEXTURE_2D, 0);
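To tie this back to the original question's PDF goal, the CGContextDrawImage() call in the first snippet could be swapped for PDF drawing calls. This is only a rough sketch, assuming pdfPage is a valid CGPDFPageRef obtained elsewhere (for example via CGPDFDocumentGetPage()) and imageContext is the bitmap context created above; the glTexImage2D() upload afterwards stays the same:

// Assumption: pdfPage is a CGPDFPageRef obtained elsewhere; imageContext is the bitmap context from above.
// Fill the background first, since PDF pages are normally composited over white.
CGContextSetRGBFillColor(imageContext, 1.0, 1.0, 1.0, 1.0);
CGContextFillRect(imageContext, CGRectMake(0.0, 0.0, pixelSizeOfImage.width, pixelSizeOfImage.height));

// Scale and position the page's crop box to fit the bitmap, then draw the page.
CGAffineTransform pageTransform = CGPDFPageGetDrawingTransform(pdfPage, kCGPDFCropBox, CGRectMake(0.0, 0.0, pixelSizeOfImage.width, pixelSizeOfImage.height), 0, true);
CGContextConcatCTM(imageContext, pageTransform);
CGContextDrawPDFPage(imageContext, pdfPage);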
For rapidly changing content, you might want to look into iOS 5.0's texture caches (CVOpenGLESTextureCacheCreateTextureFromImage() and the like), which might let you render directly to the bytes for your texture. However, I've found that the overhead of creating and rendering to a texture with a texture cache makes this slightly slower for rendering a single image, so if you don't need to continually update it, the code above is probably your fastest route.
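For completeness, here is a rough sketch of that texture-cache path (iOS 5.0+, OpenGL ES 2.0, CoreVideo framework). The width and height variables and the absence of error checking are assumptions for brevity, not a definitive implementation:

// Assumption: width and height (size_t) are defined by the caller, and an EAGLContext is current.
// Create a texture cache tied to the current EAGLContext (typically one cache per context).
CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, [EAGLContext currentContext], NULL, &textureCache);

// Create a BGRA pixel buffer that Core Graphics can draw into and OpenGL ES can sample from.
NSDictionary *attributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef)attributes, &pixelBuffer);

// Wrap the pixel buffer in an OpenGL ES texture.
CVOpenGLESTextureRef texture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, (GLsizei)width, (GLsizei)height, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

// Draw with Core Graphics directly into the pixel buffer's backing bytes.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer), width, height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// ... CGContextDrawPDFPage() or other drawing calls go here ...
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

// Bind via the texture name the cache returned, and set the usual sampling parameters.
glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// When finished: CFRelease(texture); CVPixelBufferRelease(pixelBuffer); CFRelease(textureCache);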
Source: https://stackoverflow.com/questions/9903864/render-coregraphics-to-opengl-texture-on-ios