Video as texture in OpenGLES2.0

Submitted by 笑着哭i on 2020-01-01 19:20:32

Question


I want to render video as a texture on an object in OpenGL ES 2.0 on iOS. I create an AVPlayer with an AVPlayerItemVideoOutput, configured as follows:

NSDictionary *videoOutputOptions = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                   [NSDictionary dictionary], kCVPixelBufferIOSurfacePropertiesKey,
                                   nil];
self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:videoOutputOptions];

Then I get a CVPixelBufferRef for each moment in time (note that copyPixelBufferForItemTime: can return NULL when no new frame is ready):

CMTime currentTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:NULL];

Then I convert it to a UIImage with this method:

+ (UIImage *)imageWithCVPixelBufferUsingUIGraphicsContext:(CVPixelBufferRef)pixelBuffer
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t w = CVPixelBufferGetWidth(pixelBuffer);
    size_t h = CVPixelBufferGetHeight(pixelBuffer);
    // Bytes per row can be larger than w * 4 due to row padding.
    size_t srcBytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    unsigned char *bufferU = CVPixelBufferGetBaseAddress(pixelBuffer);
    UIGraphicsBeginImageContext(CGSizeMake(w, h));
    CGContextRef c = UIGraphicsGetCurrentContext();
    unsigned char *data = CGBitmapContextGetData(c);
    if (data) {
        size_t dstBytesPerRow = CGBitmapContextGetBytesPerRow(c);
        for (size_t y = 0; y < h; y++) {
            // Copy one row at a time; source and destination strides may differ.
            // The source pixels are BGRA (as requested in videoOutputOptions).
            memcpy(data + y * dstBytesPerRow, bufferU + y * srcBytesPerRow, w * 4);
        }
    }
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    CFRelease(pixelBuffer);
    return image;
}
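One subtlety in the conversion above: CVPixelBufferGetBytesPerRow can be larger than width × 4 because of row padding, so indexing the source buffer with `w * y` can shear or corrupt the image. The stride-aware copy can be sketched in plain C (the function name and tightly packed destination are assumptions for illustration):

```c
#include <string.h>
#include <stddef.h>

/* Copy a w x h image (4 bytes per pixel) from a source buffer whose
 * rows are srcStride bytes apart into a tightly packed destination.
 * Rows are copied individually because the strides may differ. */
static void copy_pixels_tight(unsigned char *dst, const unsigned char *src,
                              size_t w, size_t h, size_t srcStride)
{
    for (size_t y = 0; y < h; y++) {
        memcpy(dst + y * w * 4, src + y * srcStride, w * 4);
    }
}
```

With a padded buffer (say, a 2-pixel-wide image stored with a 12-byte stride), the second row of the destination starts at source offset `srcStride`, not `w * 4`.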

As a result, I get the required frame from the video.

After all that, I try to update the texture with:

- (void)setupTextureWithImage:(UIImage *)image
{
    if (_texture.name) {
        GLuint textureName = _texture.name;
        glDeleteTextures(1, &textureName);
    }

    NSError *error;
    _texture = [GLKTextureLoader textureWithCGImage:image.CGImage options:nil error:&error];
    if (error) {
        NSLog(@"Error during loading texture: %@", error);
    }
}

I call this method from GLKView's update method, but the result is a black screen; only the audio is available.

Can anyone explain what's wrong? It looks like I'm doing something wrong with the textures...


Answer 1:


The issue is most likely somewhere other than the code you posted. To check the texture itself, capture a frame snapshot (a feature in Xcode) and see whether the correct texture appears there. Maybe your texture coordinates are incorrect, some parameter is missing when you draw the textured object, you forgot to enable some attributes, or the shaders are not present...

Since you have got this far, I suggest you first draw a colored square, then apply a texture (not from the video) to it until you get the correct result, and only then implement the texture from the video.
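For that textured-square experiment, a minimal ES 2.0 shader pair is enough to verify that texturing works at all (the attribute and uniform names here are assumptions; use whatever your program binds):

```glsl
// Vertex shader: pass position and texture coordinate through.
attribute vec4 a_position;
attribute vec2 a_texCoord;
varying vec2 v_texCoord;
void main() {
    gl_Position = a_position;
    v_texCoord = a_texCoord;
}
```

```glsl
// Fragment shader: sample the bound texture.
precision mediump float;
varying vec2 v_texCoord;
uniform sampler2D u_texture;
void main() {
    gl_FragColor = texture2D(u_texture, v_texCoord);
}
```

If the square stays black with these shaders and a known-good test image, the problem is in the draw setup (attributes, uniforms, texture binding) rather than in the video pipeline.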

And just a suggestion: since you are getting raw pixel data from the video, you should consider creating a single texture once and then using the texture sub-image function to update it directly with the data, instead of those iterations and image transformations. glTexSubImage2D takes your buffer pointer directly and performs the update.
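One caveat with that approach: ES 2.0 core has no BGRA internal format (on iOS it is only exposed through the GL_APPLE_texture_format_BGRA8888 extension), so if the texture was allocated as GL_RGBA and you upload the BGRA pixel buffer data unchanged, red and blue come out swapped. A hypothetical in-place swizzle, sketched in C:

```c
#include <stddef.h>

/* Swap the B and R bytes of every pixel in a tightly packed
 * BGRA buffer, turning it into RGBA in place. */
static void bgra_to_rgba(unsigned char *pixels, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; i++) {
        unsigned char b = pixels[i * 4 + 0];
        pixels[i * 4 + 0] = pixels[i * 4 + 2]; /* R moves into byte 0 */
        pixels[i * 4 + 2] = b;                 /* B moves into byte 2 */
    }
}
```

With the data in RGBA order, the per-frame update is then a single call such as `glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels)` against a texture created once with glTexImage2D.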




Answer 2:


I tried launching it on a device, and it works fine.

It looks like the problem is that the simulator does not support some operations.



Source: https://stackoverflow.com/questions/27268663/video-as-texture-in-opengles2-0
