gpuimage

How to adjust the brightness and contrast of an image with GPUImage?

╄→гoц情女王★ submitted on 2019-12-24 02:01:44

Question: I wrote a method to filter an image with a brightness factor and a contrast factor, as follows:

```objc
- (UIImage*)image:(UIImage*)image withBrightness:(float)brightness contrast:(float)contrast {
    GPUImagePicture *imagePicture = [[GPUImagePicture alloc] initWithImage:image];
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
    [imagePicture addTarget:brightnessFilter];
    [brightnessFilter
```
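For reference, a minimal sketch of one way such a chain is commonly completed, assuming the GPUImage still-image capture API (useNextFrameForImageCapture / imageFromCurrentFramebuffer); the value-range comments reflect the filters' documented defaults:

```objc
#import <GPUImage/GPUImage.h>

// Hedged sketch: chain brightness and contrast filters on a still image.
- (UIImage *)image:(UIImage *)image withBrightness:(float)brightness contrast:(float)contrast {
    GPUImagePicture *imagePicture = [[GPUImagePicture alloc] initWithImage:image];

    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    brightnessFilter.brightness = brightness;   // -1.0 ... 1.0, 0.0 leaves the image unchanged

    GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
    contrastFilter.contrast = contrast;         // 0.0 ... 4.0, 1.0 leaves the image unchanged

    // picture -> brightness -> contrast
    [imagePicture addTarget:brightnessFilter];
    [brightnessFilter addTarget:contrastFilter];

    // Ask the last filter to keep its framebuffer so we can read the result back.
    [contrastFilter useNextFrameForImageCapture];
    [imagePicture processImage];
    return [contrastFilter imageFromCurrentFramebuffer];
}
```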

GPUImage's GPUImageOpacityFilter not behaving as expected, doesn't change alpha channel

时间秒杀一切 submitted on 2019-12-23 17:11:35

Question: I'm trying to do an Overlay Blend of a stock image with the output of the camera feed, where the stock image has less than 100% opacity. I figured I could just place a GPUImageOpacityFilter in the filter stack and everything would be fine:

GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
GPUImagePicture -> GPUImageOpacityFilter (opacity 0.1f) -> MY_GPUImageOverlayBlendFilter
MY_GPUImageOverlayBlendFilter -> GPUImageView

But what that resulted in wasn't a 0.1f alpha version of
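For illustration, a hedged sketch of how that pipeline might be wired in code, using a plain GPUImageOverlayBlendFilter for the blend step (the MY_ prefix in the question suggests a custom subclass); the stock.png image name and the view-controller context are assumptions:

```objc
// Hedged sketch of the pipeline described above; image name and blend filter are assumptions.
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImagePicture *stockPicture =
    [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"stock.png"]];
GPUImageOpacityFilter *opacityFilter = [[GPUImageOpacityFilter alloc] init];
opacityFilter.opacity = 0.1;

GPUImageOverlayBlendFilter *blendFilter = [[GPUImageOverlayBlendFilter alloc] init];

// The first target added becomes the base input, the second becomes the overlay input.
[videoCamera addTarget:blendFilter];
[stockPicture addTarget:opacityFilter];
[opacityFilter addTarget:blendFilter];

GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:previewView];
[blendFilter addTarget:previewView];

// Keep strong references to videoCamera and stockPicture (e.g. in properties);
// otherwise ARC releases them when this method returns and the feed stops.
[stockPicture processImage];
[videoCamera startCameraCapture];
```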

GPUImage: apply a filter to a buffer of images

前提是你 submitted on 2019-12-23 05:10:51

Question: In GPUImage there are some filters that work only on a stream of frames from a camera, for instance the low-pass and high-pass filters, but there are plenty of them. I'm trying to create a buffer of UIImages so that, at a fixed frame rate, those filters can also be applied between just two images, producing one filtered image for each pair. Something like this:

FirstImage+SecondImage --> FirstFilteredImage
SecondImage+ThirdImage --> SecondFilteredImage

I've
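As a rough illustration of the pair-wise idea, a hedged sketch that combines each consecutive pair of UIImages with a two-input filter (GPUImageDissolveBlendFilter here, purely as an example; the method name is made up):

```objc
// Hedged sketch: combine each consecutive pair of UIImages with a two-input filter.
- (NSArray *)filteredImagesFromBuffer:(NSArray *)images
{
    NSMutableArray *results = [NSMutableArray array];

    for (NSUInteger i = 0; i + 1 < images.count; i++) {
        GPUImagePicture *first  = [[GPUImagePicture alloc] initWithImage:images[i]];
        GPUImagePicture *second = [[GPUImagePicture alloc] initWithImage:images[i + 1]];

        GPUImageDissolveBlendFilter *blend = [[GPUImageDissolveBlendFilter alloc] init];
        blend.mix = 0.5;   // equal weight for the two frames

        [first addTarget:blend];
        [second addTarget:blend];

        [blend useNextFrameForImageCapture];
        [first processImage];
        [second processImage];

        [results addObject:[blend imageFromCurrentFramebuffer]];
    }

    return results;
}
```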

No audio in video recording (using GPUImage) after initializing The Amazing Audio Engine

本小妞迷上赌 submitted on 2019-12-23 03:32:17

Question: I'm using two third-party tools in my project. One is "The Amazing Audio Engine", which I use for audio filters. The other is GPUImage, or more specifically GPUImageMovieWriter. When I record videos, I merge an audio recording with the video, and this works fine. However, sometimes I do not use The Amazing Audio Engine and just record a normal video using GPUImageMovieWriter. The problem is that, even just after initializing The Amazing Audio Engine, the video has only a fraction of a second of audio
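For context, a hedged sketch of the plain GPUImage-only recording path described here, where the camera encodes its own microphone audio into the writer; the output path and size are illustrative:

```objc
// Hedged sketch: plain GPUImage recording with the camera's own audio track.
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

NSURL *movieURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"movie.m4v"]];
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc]
    initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];

[videoCamera addTarget:movieWriter];

// Let GPUImage pull audio from the camera's microphone input into the writer.
videoCamera.audioEncodingTarget = movieWriter;

[videoCamera startCameraCapture];
[movieWriter startRecording];
```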

Unable to edit video using GPUImage

回眸只為那壹抹淺笑 submitted on 2019-12-23 03:07:05

Question: I have created a video using AVFoundation and now I want to edit it with the GPUImage framework. I have done all the setup as mentioned here. After seeing his "SimpleVideoFileFilter" example I simply copied his code and replaced the asset URL with my video's URL. Here is the code:

```objc
movieFile = [[GPUImageMovie alloc] initWithURL:player.contentURL];
pixellateFilter = [[GPUImagePixellateFilter alloc] init];
[movieFile addTarget:pixellateFilter];
NSString *pathToMovie = [NSHomeDirectory()
```
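A hedged sketch of how the rest of a SimpleVideoFileFilter-style pipeline is typically wired into a GPUImageMovieWriter; the output filename and size are assumptions:

```objc
// Hedged sketch: continue the pipeline into a GPUImageMovieWriter.
NSString *pathToMovie = [NSHomeDirectory()
    stringByAppendingPathComponent:@"Documents/filtered.m4v"];
[[NSFileManager defaultManager] removeItemAtPath:pathToMovie error:nil];
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc]
    initWithMovieURL:movieURL size:CGSizeMake(640.0, 480.0)];
[pixellateFilter addTarget:movieWriter];

// Preserve the source movie's frames and hand its audio straight through to the writer.
movieFile.playAtActualSpeed = NO;
movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

[movieWriter startRecording];
[movieFile startProcessing];

__weak GPUImageMovieWriter *weakWriter = movieWriter;
[movieWriter setCompletionBlock:^{
    [pixellateFilter removeTarget:weakWriter];
    [weakWriter finishRecording];
}];
```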

GPUImageVideoCamera save video on iOS 7

倾然丶 夕夏残阳落幕 submitted on 2019-12-23 03:01:34

Question: I'm trying to achieve square video recording (e.g. 300*300), so I chose GPUImage, but it is not working on iOS 7 and gives errors like [UIView nextAvailableTextureIndex]: unrecognized selector sent to instance. The error appears even when we build the sample code. When trying to save with GPUImageVideoCamera it sometimes gets stuck at [movieWriter startRecording];. Is GPUImage compatible with iOS 7, or do we have to make some changes? Here is the code:

```objc
- (void)viewDidLoad {
    [super viewDidLoad];
```
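Separately from the selector error, here is a hedged sketch of one common way to record a square 300*300 video with GPUImage, cropping a portrait camera feed to a centered square before the movie writer; the crop region and output path are assumptions:

```objc
// Hedged sketch: crop the camera feed to a centered square and record at 300x300.
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

// Assuming a 480x640 portrait frame: keep the full width and the middle 3/4 of the height.
GPUImageCropFilter *cropFilter = [[GPUImageCropFilter alloc]
    initWithCropRegion:CGRectMake(0.0, 0.125, 1.0, 0.75)];
[videoCamera addTarget:cropFilter];

NSURL *movieURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"square.m4v"]];
GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc]
    initWithMovieURL:movieURL size:CGSizeMake(300.0, 300.0)];
[cropFilter addTarget:movieWriter];

videoCamera.audioEncodingTarget = movieWriter;
[videoCamera startCameraCapture];
[movieWriter startRecording];
```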

GPUImage displaying image instead of live video

青春壹個敷衍的年華 submitted on 2019-12-22 18:49:10

Question: I've gone through the Objective-C GPUImage framework and, as per the example in the documentation, I added the following snippet with the intention of displaying filtered live video:

```objc
CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageFilter *customFilter =
```
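A hedged sketch of a working live-preview setup; the key detail is keeping the camera in a strong property, since a local GPUImageVideoCamera is released by ARC as soon as the setup method returns and the feed then stops (often leaving a single frozen frame). The class name and the sepia filter are illustrative:

```objc
#import <GPUImage/GPUImage.h>

// Hedged sketch: the camera lives in a strong property so ARC keeps it alive
// after viewDidLoad returns; a local variable would be deallocated immediately.
@interface LiveFilterViewController : UIViewController
@property (nonatomic, strong) GPUImageVideoCamera *videoCamera;
@end

@implementation LiveFilterViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    self.videoCamera = [[GPUImageVideoCamera alloc]
        initWithSessionPreset:AVCaptureSessionPreset640x480
               cameraPosition:AVCaptureDevicePositionBack];
    self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];   // illustrative filter
    GPUImageView *filteredView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:filteredView];

    // camera -> filter -> view; sources retain their targets, so only the camera
    // itself needs an explicit strong reference.
    [self.videoCamera addTarget:filter];
    [filter addTarget:filteredView];

    [self.videoCamera startCameraCapture];
}

@end
```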

videoMinFrameDuration is Deprecated

瘦欲@ submitted on 2019-12-22 14:03:56

Question: When I updated Xcode from 4.6 to 5.1, 'videoMinFrameDuration' is deprecated in iOS 7:

```objc
- (void)setFrameRate:(NSInteger)frameRate;
{
    _frameRate = frameRate;

    if (_frameRate > 0)
    {
        for (AVCaptureConnection *connection in videoOutput.connections)
        {
            if ([connection respondsToSelector:@selector(setVideoMinFrameDuration:)])
                connection.videoMinFrameDuration = CMTimeMake(1, _frameRate);
```

Answer 1: For one thing, you're using an outdated version of GPUImage, as this has been fixed in the framework code
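For reference, a hedged sketch of the iOS 7 replacement, which sets the frame duration on the AVCaptureDevice itself rather than on the connection; the method name and the device parameter are illustrative:

```objc
#import <AVFoundation/AVFoundation.h>

// Hedged sketch: iOS 7+ frame-rate control via the capture device.
- (void)setFrameRate:(NSInteger)frameRate onDevice:(AVCaptureDevice *)device
{
    if (frameRate <= 0) {
        return;
    }

    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)frameRate);
        device.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)frameRate);
        [device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }
}
```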