Using GPUImage to filter a view live

Submitted by 自古美人都是妖i on 2019-12-12 19:24:02

Question


I am trying to use GPUImage to filter a view live as it updates, for an iOS 7 style blurred overlay. To do this I am running the code below on an NSTimer. However, my NSLog shows that [self.backgroundUIImage imageFromCurrentlyProcessedOutput] is returning (null). I know that the view self.background is working properly, as it is also added to the view hierarchy (self.finalImageView has also already been added). I am not sure whether I'm going about this the right way at all, as there is no real documentation on how to do this on the GPUImage GitHub page. Does anybody know if it is possible to use GPUImage in this way?

    if (!self.backgroundUIImage) {
        self.backgroundUIImage = [[GPUImageUIElement alloc] initWithView:self.background];

        GPUImageFastBlurFilter *fastBlur = [[GPUImageFastBlurFilter alloc] init];
        fastBlur.blurSize = 0.5;
        [self.backgroundUIImage addTarget:fastBlur];
        [self.backgroundUIImage update];
    }

    [self.backgroundUIImage update];

    NSLog(@"image : %@", [self.backgroundUIImage imageFromCurrentlyProcessedOutput]);

    self.finalImageView.image = [self.backgroundUIImage imageFromCurrentlyProcessedOutput];

    [self bringSubviewToFront:self.finalImageView];

EDIT 1

I have now looked at the FilterShowcase example code, so I am now trying to implement the live blur like so:

    NSLog(@"regenerating view");

    if (!self.backgroundUIImage) {
        GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
        blendFilter.mix = 1.0;

        self.backgroundUIImage = [[GPUImageUIElement alloc] initWithView:self.background];
        NSLog(@"self.backgroundUIImage : %@", self.backgroundUIImage);

        self.filter = [[GPUImageFastBlurFilter alloc] init];
        self.filter.blurSize = 10;

        [self.filter addTarget:blendFilter];
        [self.backgroundUIImage addTarget:blendFilter];
        [blendFilter addTarget:self.finalImageView];

        __unsafe_unretained GPUImageUIElement *weakUIElementInput = self.backgroundUIImage;
        [self.filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
            [weakUIElementInput update];
        }];
    }

    __unsafe_unretained GPUImageUIElement *weakUIElementInput = self.backgroundUIImage;
    [self.filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
        [weakUIElementInput update];
    }];

    [self bringSubviewToFront:self.finalImageView];

Where the @property declarations are set up like so:

@property (nonatomic, strong) GPUImageView *finalImageView;
@property (nonatomic, strong) SMBouncingBubblesBackground *background;

@property (nonatomic, strong) GPUImageUIElement *backgroundUIImage;
@property (nonatomic, strong) GPUImageFastBlurFilter *filter;

It looks like no image is being generated, as I can still see the unfiltered background view through self.finalImageView. Some documentation for how to do this would be really great!
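For reference, here is a minimal sketch of the pipeline I think should work, with the NSTimer from my first attempt still driving -update (untested sketch only; the method names -setUpLiveBlur and -timerFired: are placeholders, and the property names are the ones declared above):

    // Sketch: GPUImageUIElement -> blur filter -> on-screen GPUImageView,
    // with an external NSTimer re-rendering the UI element each tick.
    - (void)setUpLiveBlur
    {
        self.backgroundUIImage = [[GPUImageUIElement alloc] initWithView:self.background];

        self.filter = [[GPUImageFastBlurFilter alloc] init];
        self.filter.blurSize = 10.0;

        // Feed the UI element into the blur, and the blur into the GPUImageView,
        // instead of pulling a UIImage out with imageFromCurrentlyProcessedOutput.
        [self.backgroundUIImage addTarget:self.filter];
        [self.filter addTarget:self.finalImageView];
    }

    - (void)timerFired:(NSTimer *)timer
    {
        // Re-capture the view's contents and push a new frame through the chain.
        [self.backgroundUIImage update];
    }

Note this differs from my EDIT 1 code in that the GPUImageUIElement explicitly targets self.filter, so the blur actually has an input.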

Source: https://stackoverflow.com/questions/18353298/using-gpuimage-to-filter-a-view-live
