Question
I am trying to use GPUImage to filter a view as it is updated, for a kind of iOS 7-style blurred overlay. To do this I am running the following code on an NSTimer; however, my NSLog shows that [self.backgroundUIImage imageFromCurrentlyProcessedOutput] is returning (null).
I know that the view self.background is working properly, as it is also added to the view hierarchy (self.finalImageView has already been added as well). I am not sure whether I'm going about this the right way at all, as there is no real documentation on how to do this on the GPUImage GitHub page. Does anybody know if it is possible to use GPUImage in this way?
if (!self.backgroundUIImage) {
    // Wrap the live view in a GPUImageUIElement and attach a fast blur to it.
    self.backgroundUIImage = [[GPUImageUIElement alloc] initWithView:self.background];
    GPUImageFastBlurFilter *fastBlur = [[GPUImageFastBlurFilter alloc] init];
    fastBlur.blurSize = 0.5;
    [self.backgroundUIImage addTarget:fastBlur];
    [self.backgroundUIImage update];
}

// Re-render the element on every timer tick and try to read back the processed image.
[self.backgroundUIImage update];
NSLog(@"image : %@", [self.backgroundUIImage imageFromCurrentlyProcessedOutput]);
self.finalImageView.image = [self.backgroundUIImage imageFromCurrentlyProcessedOutput];
[self bringSubviewToFront:self.finalImageView];
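For comparison, here is a stripped-down sketch of the same capture attempt, but reading the processed UIImage back from the blur filter at the end of the chain rather than from the GPUImageUIElement itself (same old-style imageFromCurrentlyProcessedOutput API as above; I'm not certain this is the intended usage, which is part of what I'm asking):

// Sketch only: pull the rendered image from the filter, not from the UI element source.
GPUImageUIElement *element = [[GPUImageUIElement alloc] initWithView:self.background];
GPUImageFastBlurFilter *fastBlur = [[GPUImageFastBlurFilter alloc] init];
fastBlur.blurSize = 0.5;

[element addTarget:fastBlur];
[element update]; // renders self.background and pushes the frame through fastBlur

UIImage *blurred = [fastBlur imageFromCurrentlyProcessedOutput];
NSLog(@"blurred image : %@", blurred);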
EDIT 1
I have now looked at the FilterShowcase example code, and I'm therefore trying to implement the live blur like so:
NSLog(@"regenerating view");
if (!self.backgroundUIImage) {
GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
blendFilter.mix = 1.0;
self.backgroundUIImage = [[GPUImageUIElement alloc] initWithView:self.background];
NSLog(@"self.backgroundUIImage : %@",self.backgroundUIImage);
self.filter = [[GPUImageFastBlurFilter alloc] init];
self.filter.blurSize = 10;
[self.filter addTarget:blendFilter];
[self.backgroundUIImage addTarget:blendFilter];
[blendFilter addTarget:self.finalImageView];
__unsafe_unretained GPUImageUIElement *weakUIElementInput = self.backgroundUIImage;
[self.filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
[weakUIElementInput update];
}];
}
__unsafe_unretained GPUImageUIElement *weakUIElementInput = self.backgroundUIImage;
[self.filter setFrameProcessingCompletionBlock:^(GPUImageOutput * filter, CMTime frameTime){
[weakUIElementInput update];
}];
[self bringSubviewToFront:self.finalImageView];
Where the @properties are set up like so:
@property (nonatomic, strong) GPUImageView *finalImageView;
@property (nonatomic, strong) SMBouncingBubblesBackground *background;
@property (nonatomic, strong) GPUImageUIElement *backgroundUIImage;
@property (nonatomic, strong) GPUImageFastBlurFilter *filter;
This doesn't look like it's generating an image, as I can see the background view through self.finalImageView. Some documentation on how to do this would be really great!
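For reference, this is the simplest timer-driven pipeline I can think of, skipping the blend filter entirely and chaining the element straight into the blur and then into the GPUImageView (hypothetical startLiveBlur / refreshBlur methods, same GPUImageFastBlurFilter API as above; this is a sketch of what I'm trying to achieve rather than working code):

// Sketch, not verified: GPUImageUIElement -> blur -> GPUImageView, redrawn by a display link,
// since there is no camera pushing frames through the chain.
- (void)startLiveBlur
{
    self.backgroundUIImage = [[GPUImageUIElement alloc] initWithView:self.background];
    self.filter = [[GPUImageFastBlurFilter alloc] init];
    self.filter.blurSize = 10;

    [self.backgroundUIImage addTarget:self.filter];
    [self.filter addTarget:self.finalImageView];

    CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self
                                                             selector:@selector(refreshBlur)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)refreshBlur
{
    // Re-render self.background into the GPUImage pipeline on every screen refresh.
    [self.backgroundUIImage update];
}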
Source: https://stackoverflow.com/questions/18353298/using-gpuimage-to-filter-a-view-live