Applying Effect to iPhone Camera Preview “Video”

太阳男子 2020-12-29 00:22

My goal is to write a custom camera view controller that:

  1. Can take photos in all four interface orientations with both the back and, when available, front cameras
2 Answers
  • 2020-12-29 01:00

    A fundamentally better approach would be to use OpenGL to handle as much of the image-related heavy lifting for you (as I see you're trying in your latest attempt). However, even then you might have issues with building up frames to be processed.

    While it seems strange that you'd be running into memory accumulation when processing frames (in my experience, you simply stop receiving them if you can't process them fast enough), Grand Central Dispatch queues can get jammed up if they are waiting on I/O.

    Perhaps a dispatch semaphore would let you throttle the addition of new items to the processing queues. For more on this, I highly recommend Mike Ash's "GCD Practicum" article, where he looks at optimizing an I/O bound thumbnail processing operation using dispatch semaphores.
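    The throttling idea above can be sketched roughly as follows. This is only an illustration, not code from the answer: the `CameraController` class and the `processImage:` helper are hypothetical names, and the semaphore count of 1 (drop frames while one is in flight) is an assumed policy you would tune for your workload.

    // Hypothetical sketch: throttle frame processing with a dispatch semaphore.
    @interface CameraController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate> {
        dispatch_semaphore_t frameSemaphore;
        dispatch_queue_t processingQueue;
    }
    @end

    @implementation CameraController

    - (id)init {
        if ((self = [super init])) {
            // Allow at most one frame in flight; raise the count to buffer more.
            frameSemaphore = dispatch_semaphore_create(1);
            processingQueue = dispatch_queue_create("processingQueue", NULL);
        }
        return self;
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // If the previous frame is still being processed, drop this one
        // instead of letting work pile up on the queue.
        if (dispatch_semaphore_wait(frameSemaphore, DISPATCH_TIME_NOW) != 0)
            return;

        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        dispatch_async(processingQueue, ^{
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
            [self processImage:image];  // hypothetical processing helper
            [pool release];
            dispatch_semaphore_signal(frameSemaphore);
        });
    }

    @end

    The key point is the non-blocking `DISPATCH_TIME_NOW` wait: rather than blocking the capture callback (which would stall the camera), frames arriving while the queue is busy are simply discarded.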

  • 2020-12-29 01:15

    To prevent the memory issues, simply create an autorelease pool in captureOutput:didOutputSampleBuffer:fromConnection:.

    This makes sense since imageFromSampleBuffer: returns an autoreleased UIImage object. Plus it frees up any autoreleased objects created by image processing code right away.

    // Delegate routine that is called when a sample buffer was written
    - (void)captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
    fromConnection:(AVCaptureConnection *)connection
    { 
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    
        // Create a UIImage from the sample buffer data
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    
        // < Add your code here that uses the image >
    
        [pool release];
    }
    

    My testing has shown that this will run without memory warnings on an iPhone 4 or iPod Touch (4th gen) even if requested FPS is very high (e.g. 60) and image processing is very slow (e.g. 0.5+ secs).

    OLD SOLUTION:

    As Brad pointed out, Apple recommends doing image processing on a background thread so that it doesn't interfere with UI responsiveness. I didn't notice much lag in this case, but best practices are best practices, so use the solution above with the autorelease pool instead of running this on the main dispatch queue / main thread.

    To prevent the memory issues, simply use the main dispatch queue instead of creating a new one.

    This also means that you don't have to switch to the main thread in captureOutput:didOutputSampleBuffer:fromConnection: when you want to update the UI.

    In setupCaptureSession, change FROM:

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    

    TO:

    // we want our dispatch to be on the main thread
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    