Question
I've gone through the Objective-C GPUImage framework and, following the example in the documentation, added the following snippet with the intention of displaying filtered live video:
CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"PositionColor"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:mainScreenFrame];
// Add the view somewhere so it's visible
[self.view addSubview:filteredVideoView];
[videoCamera addTarget:customFilter];
[customFilter addTarget:filteredVideoView];
[videoCamera startCameraCapture];
It runs, but instead of video I get a single still image. I've reviewed it and looked at examples and can't really pinpoint why it isn't working correctly. Any help would be much appreciated.
Thanks!
Answer 1:
You need to hang on to your GPUImageVideoCamera somewhere. If the above is placed in a method, and ARC is enabled for the project, the GPUImageVideoCamera instance will be deallocated the instant that method exits. This will terminate the video capture and could lead to other unsettling artifacts.
Instead, make the GPUImageVideoCamera an instance variable of your class and use that for the above. You'll also need it later so you can pause and stop the camera when you're done.
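A minimal sketch of that change, assuming a view controller named CameraViewController and the same "PositionColor" shader file from the question; the class name and property names are illustrative, while the GPUImage calls are the documented ones from the question:

// CameraViewController.m -- assumes GPUImage is linked and the PositionColor shader is bundled.
#import "GPUImage.h"

@interface CameraViewController ()
// Strong properties keep the capture pipeline alive after viewDidLoad returns under ARC.
@property (nonatomic, strong) GPUImageVideoCamera *videoCamera;
@property (nonatomic, strong) GPUImageFilter *customFilter;
@property (nonatomic, strong) GPUImageView *filteredVideoView;
@end

@implementation CameraViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    self.videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                           cameraPosition:AVCaptureDevicePositionBack];
    self.videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    self.customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"PositionColor"];
    self.filteredVideoView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:self.filteredVideoView];

    [self.videoCamera addTarget:self.customFilter];
    [self.customFilter addTarget:self.filteredVideoView];
    [self.videoCamera startCameraCapture];
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Stop capture when leaving the screen; the camera itself stays retained by the property.
    [self.videoCamera stopCameraCapture];
}

@end

Because the properties retain the camera, filter, and view, nothing in the chain is deallocated when viewDidLoad exits, so frames keep flowing to the GPUImageView instead of freezing on the first one.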
Source: https://stackoverflow.com/questions/21999138/gpuimage-displaying-image-instead-of-live-video