GPUImage apply filter to a buffer of images

丶灬走出姿态 submitted on 2019-12-23 05:10:03

Question


In GPUImage there are some filters that work only on a stream of frames from a camera, for instance the low-pass filter or the high-pass filter, but there are plenty of them.
I'm trying to create a buffer of UIImages with a fixed time rate that makes it possible to apply those filters between just two images, producing a single filtered image for each pair. Something like this:
FirstImage+SecondImage-->FirstFilteredImage
SecondImage+ThirdImage-->SecondFilteredImage
I've found that filters that work with frames use a GPUImageBuffer, a subclass of GPUImageFilter (most probably just to inherit some methods and protocols) that loads a passthrough fragment shader. From what I understood, this is a buffer that keeps incoming frames already "texturized"; the textures are generated by binding the texture in the current context.
I've also found a -conserveMemoryForNextFrame method that sounds good for what I want to achieve, but I didn't get how it works.
Is it possible to do this? In which method are images converted into textures?


Answer 1:


I made something close to what I'd like to achieve, but first I must say that I probably misunderstood some aspects of how the current filters work.
I thought that some filters could take the time variable into account in their shader operations. That's because when I saw the low-pass filter and the high-pass filter I instantly thought about time. The reality seems to be different: they do receive a time value, but it doesn't seem to affect the filtering operations.
Since I'm developing a time-lapse application that saves single images and reassembles them into a different timeline to make a video without audio, I imagined that filters that are a function of time could be fun to apply to the subsequent frames. This is why I posted this question.
Now the answer: to apply a two-input filter to still images you must do it like in this snippet:

    // Feed both still images into the two-input filter.
    [sourcePicture1 addTarget:twoinputFilter];
    [sourcePicture1 processImage];
    [sourcePicture2 addTarget:twoinputFilter];
    [sourcePicture2 processImage];

    // Required before capture; otherwise the framebuffer is reused
    // and the captured image is nil.
    [twoinputFilter useNextFrameForImageCapture];
    UIImage *image = [twoinputFilter imageFromCurrentFramebuffer];

If you forget to call -useNextFrameForImageCapture, the returned image will be nil, due to framebuffer reuse.
Not happy with that, I thought that maybe in the future the good Brad would add something like this, so I created a GPUImagePicture subclass that, instead of returning kCMTimeInvalid from the appropriate methods, returns a new ivar called frameTime that contains the frame's CMTime.
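Applying the snippet above pairwise over a whole buffer of still images could look like the following sketch. This is an assumption of mine, not code from the question: the helper name -filteredImagesFromArray: is hypothetical, and GPUImageDissolveBlendFilter stands in for whatever two-input filter you actually want.

```objc
// Hypothetical helper: runs a two-input filter over consecutive image pairs,
// producing (n - 1) filtered images from n source images.
- (NSArray *)filteredImagesFromArray:(NSArray *)images
{
    NSMutableArray *results = [NSMutableArray array];
    for (NSUInteger i = 0; i + 1 < images.count; i++) {
        GPUImagePicture *first  = [[GPUImagePicture alloc] initWithImage:images[i]];
        GPUImagePicture *second = [[GPUImagePicture alloc] initWithImage:images[i + 1]];
        GPUImageDissolveBlendFilter *filter = [[GPUImageDissolveBlendFilter alloc] init];

        [first addTarget:filter];
        [first processImage];
        [second addTarget:filter];
        [second processImage];

        // Without this call, imageFromCurrentFramebuffer returns nil
        // because the framebuffer is reused.
        [filter useNextFrameForImageCapture];
        [results addObject:[filter imageFromCurrentFramebuffer]];
    }
    return results;
}
```

This produces exactly the FirstImage+SecondImage --> FirstFilteredImage mapping the question asks for.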

@interface GPUImageFrame : GPUImagePicture
@property (assign, nonatomic) CMTime frameTime;
@end

@implementation GPUImageFrame

- (BOOL)processImageWithCompletionHandler:(void (^)(void))completion
{
    hasProcessedImage = YES;

    //    dispatch_semaphore_wait(imageUpdateSemaphore, DISPATCH_TIME_FOREVER);

    if (dispatch_semaphore_wait(imageUpdateSemaphore, DISPATCH_TIME_NOW) != 0)
    {
        return NO;
    }

    runAsynchronouslyOnVideoProcessingQueue(^{
        for (id<GPUImageInput> currentTarget in targets)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndexOfTarget = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];

            [currentTarget setCurrentlyReceivingMonochromeInput:NO];
            [currentTarget setInputSize:pixelSizeOfImage atIndex:textureIndexOfTarget];
            [currentTarget setInputFramebuffer:outputFramebuffer atIndex:textureIndexOfTarget];
            [currentTarget newFrameReadyAtTime:_frameTime atIndex:textureIndexOfTarget];
        }

        dispatch_semaphore_signal(imageUpdateSemaphore);

        if (completion != nil) {
            completion();
        }
    });

    return YES;
}

- (void)addTarget:(id<GPUImageInput>)newTarget atTextureLocation:(NSInteger)textureLocation
{
    [super addTarget:newTarget atTextureLocation:textureLocation];

    if (hasProcessedImage)
    {
        [newTarget setInputSize:pixelSizeOfImage atIndex:textureLocation];
        [newTarget newFrameReadyAtTime:_frameTime atIndex:textureLocation];
    }
}
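With this subclass, a time-varying filter such as GPUImageLowPassFilter can be fed still frames that carry a monotonically increasing timestamp. A minimal usage sketch, assuming the GPUImageFrame class above; firstImage, secondImage, and the 30 fps timescale are placeholders of mine:

```objc
// Push two still frames through a low-pass filter with increasing CMTimes.
GPUImageLowPassFilter *lowPass = [[GPUImageLowPassFilter alloc] init];

GPUImageFrame *frame1 = [[GPUImageFrame alloc] initWithImage:firstImage];
frame1.frameTime = CMTimeMake(0, 30);   // frame 0 on a 30 fps timeline
[frame1 addTarget:lowPass];
[frame1 processImageWithCompletionHandler:nil];

GPUImageFrame *frame2 = [[GPUImageFrame alloc] initWithImage:secondImage];
frame2.frameTime = CMTimeMake(1, 30);   // frame 1 on a 30 fps timeline
[frame2 addTarget:lowPass];
[frame2 processImageWithCompletionHandler:nil];

[lowPass useNextFrameForImageCapture];
UIImage *filtered = [lowPass imageFromCurrentFramebuffer];
```

Whether the shader actually uses the timestamp depends on the filter; as noted above, the stock filters receive the time but may not use it in their filtering math.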


Source: https://stackoverflow.com/questions/20551434/gpuimage-apply-filter-to-a-buffer-of-images
