I have downloaded the GLPaint sample code from the Apple developer website to draw pictures on a canvas using OpenGL. I have made changes to the GLPaint application to meet my requirements.
First, GLPaint is a terrible application to start with if you are not at all familiar with OpenGL ES. It's way too complex of a sample application to start with.
That said, I can describe what I use for recording H.264 video from OpenGL ES in my GPUImage framework. If you care to see the full implementation of this, look at the GPUImageMovieWriter class. Note that my implementation of this is based on OpenGL ES 2.0, so you may need to make some adaptations to have this work in OpenGL ES 1.1 (used by GLPaint).
You'll use an AVAssetWriter for this. In order to get decent recording performance, you'll need to provide frames to the writer in BGRA format, not the RGBA you get from reading the screen using glReadPixels(). In my case, I used a color-swizzling shader to convert from RGBA to BGRA before reading, but you don't have that option with OpenGL ES 1.1. I'm not sure what you can do to work around this and still get decent recording speeds (with RGBA frames, I was seeing 3-5 FPS recording, whereas with BGRA I get a solid 30 FPS).
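For reference, the color-swizzling shader is nothing fancy: it's a passthrough fragment shader that reorders the channels on output, so that glReadPixels() with GL_RGBA hands back bytes that are already in BGRA order. A minimal sketch of such a shader, stored here as an Objective-C string constant (the varying and uniform names are just illustrative, not necessarily the exact ones GPUImage uses):

static NSString *const kColorSwizzlingFragmentShaderString =
    @"varying highp vec2 textureCoordinate;\n"
    @"uniform sampler2D inputImageTexture;\n"
    @"\n"
    @"void main()\n"
    @"{\n"
    @"    // Swap red and blue on output so the readback is already BGRA-ordered\n"
    @"    gl_FragColor = texture2D(inputImageTexture, textureCoordinate).bgra;\n"
    @"}\n";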
I set up the writer using code like the following:
frameData = (GLubyte *) malloc((int)videoSize.width * (int)videoSize.height * 4);
NSError *error = nil;
assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL fileType:AVFileTypeAppleM4V error:&error];
if (error != nil)
{
    NSLog(@"Error: %@", error);
}
NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: videoSize.width] forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: videoSize.height] forKey: AVVideoHeightKey];
assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
assetWriterVideoInput.expectsMediaDataInRealTime = YES;
// You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
    [NSNumber numberWithInt:videoSize.width], kCVPixelBufferWidthKey,
    [NSNumber numberWithInt:videoSize.height], kCVPixelBufferHeightKey,
    nil];
assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
[assetWriter addInput:assetWriterVideoInput];
and begin recording with the following:
startTime = [NSDate date];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
I grab and encode a color-swizzled frame using the following:
CVPixelBufferRef pixel_buffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &pixel_buffer);
if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
{
    return;
}
else
{
    CVPixelBufferLockBaseAddress(pixel_buffer, 0);
    GLubyte *pixelBufferData = (GLubyte *)CVPixelBufferGetBaseAddress(pixel_buffer);
    glReadPixels(0, 0, videoSize.width, videoSize.height, GL_RGBA, GL_UNSIGNED_BYTE, pixelBufferData);
}
// May need to add a check here, because if two consecutive frames are appended with the same presentation time, recording aborts (a sketch of one such check follows this code)
CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:startTime],120);
if (![assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime])
{
    NSLog(@"Problem appending pixel buffer at time: %lld", currentTime.value);
}
CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
CVPixelBufferRelease(pixel_buffer);
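Regarding the comment above about consecutive identical timestamps: one way to guard against that (a sketch of my own, not code lifted from GPUImageMovieWriter; previousFrameTime here is a hypothetical instance variable initialized to kCMTimeNegativeInfinity) is to only append when the presentation time has actually advanced and the writer input is ready for more data:

CMTime frameTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:startTime], 120);
// Only append when this frame's time has moved past the previously appended one
// and the input can accept more media data.
if (assetWriterVideoInput.readyForMoreMediaData && CMTIME_COMPARE_INLINE(frameTime, >, previousFrameTime))
{
    if ([assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:frameTime])
    {
        previousFrameTime = frameTime;
    }
    else
    {
        NSLog(@"Problem appending pixel buffer at time: %lld", frameTime.value);
    }
}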
and then when I'm done with it, I finish it off with the following:
[assetWriterVideoInput markAsFinished];
[assetWriter finishWriting];
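One small note if you're building against a newer SDK: as of iOS 6, finishWriting is deprecated in favor of the asynchronous finishWritingWithCompletionHandler:, so you may want something like the following instead (just a sketch; movieURL is the same URL the writer was created with):

[assetWriterVideoInput markAsFinished];
[assetWriter finishWritingWithCompletionHandler:^{
    // The movie file at movieURL is fully written once this block runs.
    NSLog(@"Finished writing movie to %@", movieURL);
}];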
Again, you can see this in action in the above-linked framework. You might be able to modify this to work with OpenGL ES 1.1 and the GLPaint sample, but it might not have the best recording performance. As I said at the start, GLPaint is a horrible place for a newcomer to OpenGL ES to begin, so you might want to start with something a lot simpler first.