GPUImage

GPUImage - How to chain filters with GPUImagePicture?

Submitted by 允我心安 on 2019-12-11 14:23:46
Question: I want to achieve the same effect as the "Harris Corner detection" demo in GPUImage's FilterShowcase, but with a GPUImagePicture instead of a GPUImageVideoCamera. I can't get the filter chain to work. Here's my current code: -(UIImage*)addHarrisCornerToImage:(UIImage*)inputImage { GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage]; GPUImageHarrisCornerDetectionFilter *cornerDetectionFilter = [[GPUImageHarrisCornerDetectionFilter alloc] init]; [
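For reference, here is a minimal sketch of chaining a still image through the corner detector, following GPUImage's standard still-image conventions; the threshold value is an illustrative assumption:

```objectivec
// Sketch: drive a still image through a filter and read the result back.
// Assumes GPUImage.framework is linked; the threshold value is illustrative.
- (UIImage *)addHarrisCornerToImage:(UIImage *)inputImage
{
    GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
    GPUImageHarrisCornerDetectionFilter *cornerDetectionFilter =
        [[GPUImageHarrisCornerDetectionFilter alloc] init];
    cornerDetectionFilter.threshold = 0.20; // illustrative value

    [stillImageSource addTarget:cornerDetectionFilter];

    // Still images need an explicit capture request before processing,
    // otherwise the output framebuffer is reclaimed before you can read it.
    [cornerDetectionFilter useNextFrameForImageCapture];
    [stillImageSource processImage];

    return [cornerDetectionFilter imageFromCurrentFramebuffer];
}
```

Note that the Harris detector's raw output is essentially a black image with bright points at detected corners; FilterShowcase draws crosshairs over the source with a GPUImageCrosshairGenerator plus a blend, which is a separate step not shown here.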

GPUImage: GPUImageToneCurveFilter Not Working

Submitted by 浪尽此生 on 2019-12-11 14:11:23
Question: I have managed to get GPUImage working in my app and tried applying filters to my photos on a button tap, but there's another issue. GPUImageFilter *selectedFilter; if (sender.tag == 1) { selectedFilter = [[GPUImageFilter alloc] init]; } else if (sender.tag == 2) { selectedFilter = [[GPUImageThresholdEdgeDetection alloc] init]; } else if (sender.tag == 3) { selectedFilter = [[GPUImageSketchFilter alloc] init]; } else if (sender.tag == 4) { selectedFilter = [[GPUImageToneCurveFilter alloc]
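For a still photo, the usual shortcut is GPUImageFilter's imageByFilteringImage: convenience, which wraps the picture/process/capture dance in one call. A minimal sketch (the sourceImage variable is assumed):

```objectivec
// Sketch: apply whichever filter the button selected to a still image.
// imageByFilteringImage: internally builds a GPUImagePicture, processes it,
// and returns the filtered UIImage.
GPUImageFilter *selectedFilter = [[GPUImageSketchFilter alloc] init];
UIImage *filteredImage = [selectedFilter imageByFilteringImage:sourceImage];
```

Also worth checking for this specific title: a GPUImageToneCurveFilter created with plain init applies an identity curve, so it appears to "do nothing" unless you load an .acv curve file (e.g. initWithACVData:) or set its control points.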

GPUImage Kuwahara filter on iPhone 4S

Submitted by 扶醉桌前 on 2019-12-11 13:28:46
Question: I'm using Brad Larson's GPUImage framework. However, when I try to apply the Kuwahara filter with a filter radius of 5.0f, I get artifacts on an iPhone 4S (it works fine on higher-performance devices). The source image size was 2048x2048px. From the original developer's comments I understand there is a kind of watchdog timer that fires when something takes too long to run on the GPU. So my question is: what is the maximum possible resolution at which an iPhone 4S can apply the Kuwahara filter
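One common workaround for the watchdog is to cap the resolution the filter runs at with forceProcessingAtSize:, rather than filtering the full 2048x2048 source. A sketch; the 1024x1024 working size is an illustrative guess, not a measured iPhone 4S limit:

```objectivec
// Sketch: limit the working resolution of an expensive filter so it
// finishes before the GPU watchdog fires. The size below is illustrative.
GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:bigImage];
GPUImageKuwaharaFilter *kuwahara = [[GPUImageKuwaharaFilter alloc] init];
kuwahara.radius = 5;
[kuwahara forceProcessingAtSize:CGSizeMake(1024.0, 1024.0)];

[source addTarget:kuwahara];
[kuwahara useNextFrameForImageCapture];
[source processImage];
UIImage *result = [kuwahara imageFromCurrentFramebuffer];
```

Kuwahara cost grows rapidly with radius (the shader samples a neighborhood on the order of (2r+1)^2 texels), so reducing either the radius or the processing size brings the GPU time down sharply.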

GPUImage blend filters

Submitted by 扶醉桌前 on 2019-12-11 12:57:07
Question: I'm trying to apply a blend filter to 2 images. I've recently updated GPUImage to the latest version. To keep things simple I've modified the SimpleImageFilter example. Here is the code: UIImage * image1 = [UIImage imageNamed:@"PGSImage_0000.jpg"]; UIImage * image2 = [UIImage imageNamed:@"PGSImage_0001.jpg"]; twoinputFilter = [[GPUImageColorBurnBlendFilter alloc] init]; sourcePicture1 = [[GPUImagePicture alloc] initWithImage:image1]; sourcePicture2 = [[GPUImagePicture alloc] initWithImage
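For two-input blends of still images, both sources must be kept alive until processing finishes, both must target the blend, and each needs its own processImage call. A sketch under those assumptions (the strong properties are assumed to exist on the class):

```objectivec
// Sketch: blend two stills. Both GPUImagePicture sources must be retained
// (e.g. strong properties), or an input is deallocated before the blend
// runs and the output comes back empty.
self.sourcePicture1 = [[GPUImagePicture alloc] initWithImage:image1];
self.sourcePicture2 = [[GPUImagePicture alloc] initWithImage:image2];
self.twoinputFilter = [[GPUImageColorBurnBlendFilter alloc] init];

[self.sourcePicture1 addTarget:self.twoinputFilter];
[self.sourcePicture2 addTarget:self.twoinputFilter];

[self.twoinputFilter useNextFrameForImageCapture];
[self.sourcePicture1 processImage];
[self.sourcePicture2 processImage];

UIImage *blended = [self.twoinputFilter imageFromCurrentFramebuffer];
```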

GPUImage swift iOS 7

Submitted by 一世执手 on 2019-12-11 11:19:23
Question: Has anybody released an app built with Swift and GPUImage for iOS 7+? When I try to submit the app I get an error (the screenshot isn't mine, but I get the same one). The app worked perfectly on all iOS 7 devices and simulators, and it also worked when distributed ad hoc via TestFlight, but I can't release it now. Answer 1: The root problem here is support for embedded frameworks. To make an Objective-C framework like GPUImage available to Swift projects, you have to build it as a module, which first

Generating UIImage from GPUImage video frame

Submitted by 南楼画角 on 2019-12-11 05:41:21
Question: I'm trying to generate a UIImage from a video frame captured by GPUImage. I've done a lot of AVFoundation video work, but I'm new to GPUImage. I've subclassed GPUImageVideoCamera and added this method, but the UIImage is always nil. If anyone can tell me where I've gone so horribly wrong, I'd be very appreciative! - (void)processVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer { [super processVideoSampleBuffer:sampleBuffer]; // to let GPUImage do its processing first if (!self
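An alternative that avoids subclassing GPUImageVideoCamera is to hang a passthrough filter off the camera and capture from it on demand. A sketch; the videoCamera property and the point where the capture is triggered are assumed from a typical setup:

```objectivec
// Sketch: capture a UIImage from a live pipeline without subclassing.
// A bare GPUImageFilter acts as a passthrough capture point.
GPUImageFilter *passthrough = [[GPUImageFilter alloc] init];
[self.videoCamera addTarget:passthrough];

// When a still is wanted (e.g. in a button handler):
[passthrough useNextFrameForImageCapture];
// imageFromCurrentFramebuffer waits for the next camera frame to be
// processed, then returns it as a UIImage.
UIImage *frameImage = [passthrough imageFromCurrentFramebuffer];
```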

GPUImage crash ('over release framebuffer') when using Blend Filter

Submitted by 只谈情不闲聊 on 2019-12-11 03:12:14
Question: I'm getting a headache from the crash I hit when applying a Blend filter to an image and displaying it. What I want to do is simply put an overlay image onto another image. Here's my code: - (GPUImageOutput<GPUImageInput> *)myFilter { GPUImageFilterGroup *filtersGroup = [GPUImageFilterGroup new]; // Saturation GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new]; saturationFilter.saturation = 0.0; [filtersGroup addFilter:saturationFilter]; // Noise
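The "over release framebuffer" assertion with blends usually means the second input (the overlay GPUImagePicture) was deallocated while the group still referenced its framebuffer: the picture must outlive the filter group. A sketch of the blend half of such a group, with the overlay held by a strong property (assumed to exist on the class):

```objectivec
// Sketch: a blend inside a GPUImageFilterGroup. The overlay picture is kept
// in a strong property so its framebuffer is not released early, which is
// the usual cause of the 'over release framebuffer' crash.
GPUImageFilterGroup *filtersGroup = [GPUImageFilterGroup new];

GPUImageSaturationFilter *saturationFilter = [GPUImageSaturationFilter new];
saturationFilter.saturation = 0.0;
[filtersGroup addFilter:saturationFilter];

GPUImageAlphaBlendFilter *blendFilter = [GPUImageAlphaBlendFilter new];
[filtersGroup addFilter:blendFilter];

self.overlayPicture = [[GPUImagePicture alloc] initWithImage:overlayImage]; // retained
[self.overlayPicture addTarget:blendFilter atTextureLocation:1];
[self.overlayPicture processImage];

[saturationFilter addTarget:blendFilter];
filtersGroup.initialFilters = @[saturationFilter];
filtersGroup.terminalFilter = blendFilter;
```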

GPUImageView inside SKScene as SKNode material - Playing transparent video on ARKit

Submitted by a 夏天 on 2019-12-11 01:58:15
Question: In project A I used a GPUImageView to display video (recorded on a green screen) with transparency, using GPUImageChromaKeyBlendFilter and so on, and it works superbly. Another project, B, based on ARKit, shows me a plane with video in space, and it also works fine using SKVideoNode and AVPlayer. Now the question is how to combine it all into one :) So in space I want to display video, but with transparency... Unfortunately, I cannot render a GPUImageView onto any SpriteKit element, and then

imageAvailableCallback never called in basic GPUImage2 camera setup

Submitted by |▌冷眼眸甩不掉的悲伤 on 2019-12-10 23:35:00
Question: I have followed the basic setup instructions on GPUImage2's GitHub page for filtering live video and capturing an image from video, just so I can set up a basic camera. When the user taps a button, I try to capture the image from the filter using this code specifically: let pictureOutput = PictureOutput() pictureOutput.encodedImageFormat = .JPEG pictureOutput.imageAvailableCallback = {image in // Do something with the image self.previewImageView.image = image } self.filter! --> pictureOutput For

How to blur image using glsl shader without squares?

Submitted by 放肆的年华 on 2019-12-10 22:56:34
Question: I want to blur an image with the Gaussian blur algorithm, and I'm using the following shaders: Vertex shader: attribute vec4 position; attribute vec4 inputTextureCoordinate; const int GAUSSIAN_SAMPLES = 9; uniform float texelWidthOffset; uniform float texelHeightOffset; varying vec2 textureCoordinate; varying vec2 blurCoordinates[GAUSSIAN_SAMPLES]; void main() { gl_Position = position; textureCoordinate = inputTextureCoordinate.xy; // Calculate the positions for the blur int multiplier = 0; vec2 blurStep
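Blocky "squares" in a hand-rolled blur usually mean too few taps for the chosen radius, or blurring in a single pass. GPUImage's stock Gaussian blur sidesteps this with a separable two-pass blur (horizontal then vertical, driven by texelWidthOffset/texelHeightOffset per pass), so one option is to use it instead of a custom shader. A sketch; the radius value is illustrative:

```objectivec
// Sketch: separable two-pass Gaussian blur with GPUImage's stock filter.
// The radius is illustrative; the filter regenerates its shader with more
// taps as the radius grows, avoiding undersampling artifacts.
GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];
blur.blurRadiusInPixels = 4.0;

[source addTarget:blur];
[blur useNextFrameForImageCapture];
[source processImage];
UIImage *blurred = [blur imageFromCurrentFramebuffer];
```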