Question
I'm trying to do an Overlay Blend of a stock image with the output of the camera feed, where the stock image has less than 100% opacity. I figured I could just place a GPUImageOpacityFilter in the filter stack and everything would be fine:
- GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
- GPUImagePicture -> GPUImageOpacityFilter (Opacity 0.1f) -> MY_GPUImageOverlayBlendFilter
- MY_GPUImageOverlayBlendFilter -> GPUImageView
But what that resulted in wasn't a 0.1f-alpha version of the GPUImagePicture blended into the GPUImageVideoCamera; it just softened the colors/contrast of the GPUImagePicture and blended that. So I did some searching, and on a suggestion I tried getting a UIImage out of the GPUImageOpacityFilter using imageFromCurrentlyProcessedOutput and sending that into the blend filter:
- GPUImagePicture -> MY_GPUImageOpacityFilter (Opacity 0.1f)
- [MY_GPUImageOpacityFilter imageFromCurrentlyProcessedOutput]-> MY_alphaedImage
- GPUImagePicture (MY_alphaedImage) -> MY_GPUImageOverlayBlendFilter
- GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
- MY_GPUImageOverlayBlendFilter -> GPUImageView
And that worked exactly as I would expect. So why do I have to call imageFromCurrentlyProcessedOutput; shouldn't that just happen inline? Here are the code snippets for the two scenarios above:
first one:
//Create the GPUImagePicture
UIImage *image = [UIImage imageNamed:@"someFile"];
GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
//Create the Opacity filter w/0.5 opacity
GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
opacityFilter.opacity = 0.5f;
[textureImage addTarget:opacityFilter];
//Create the blendFilter
GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];
//Point the cameraDevice's output at the blendFilter
[self._videoCameraDevice addTarget:blendFilter];
//Point the opacityFilter's output at the blendFilter
[opacityFilter addTarget:blendFilter];
[textureImage processImage];
//Point the output of the blendFilter at our previewView
GPUImageView *filterView = (GPUImageView *)self.previewImageView;
[blendFilter addTarget:filterView];
second one:
//Create the GPUImagePicture
UIImage *image = [UIImage imageNamed:@"someFile"];
GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
//Create the Opacity filter w/0.5 opacity
GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
opacityFilter.opacity = 0.5f;
[textureImage addTarget:opacityFilter];
//Process the image so we get a UIImage with 0.5 opacity of the original
[textureImage processImage];
UIImage *processedImage = [opacityFilter imageFromCurrentlyProcessedOutput];
GPUImagePicture *processedTextureImage = [[[GPUImagePicture alloc] initWithImage:processedImage] autorelease];
//Create the blendFilter
GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];
//Point the cameraDevice's output at the blendFilter
[self._videoCameraDevice addTarget:blendFilter];
//Point the opacityFilter's output at the blendFilter
[processedTextureImage addTarget:blendFilter];
[processedTextureImage processImage];
//Point the output of the blendFilter at our previewView
GPUImageView *filterView = (GPUImageView *)self.previewImageView;
[blendFilter addTarget:filterView];
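Both snippets assume self._videoCameraDevice has already been created and is capturing before the blend filter is attached. A minimal sketch of that setup, with the session preset, camera position, and orientation chosen here as placeholders, might look like this:
//Assumed elsewhere in the controller: create and start the camera
//before wiring up either pipeline above. The preset/position values
//below are illustrative, not taken from the question.
self._videoCameraDevice = [[[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack] autorelease];
self._videoCameraDevice.outputImageOrientation = UIInterfaceOrientationPortrait;
[self._videoCameraDevice startCameraCapture];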
Source: https://stackoverflow.com/questions/14470786/gpuimages-gpuimageopacityfilter-not-behaving-as-expected-doesnt-change-alpha