Memory Leak in CMSampleBufferGetImageBuffer

旧时难觅i 2021-01-15 06:34

I'm getting a UIImage from a CMSampleBufferRef video buffer every N video frames, like this:

    - (void)imageFromVideoBuffer:(void(^)(UIImage *image))completion

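The snippet above is cut off; a minimal sketch of the kind of conversion being described might look like this (the lastSampleBuffer and ciContext properties are hypothetical names, not the asker's actual code):

    - (void)imageFromVideoBuffer:(void(^)(UIImage *image))completion
    {
        // lastSampleBuffer and ciContext are assumed properties holding
        // the most recent frame and a reusable CIContext.
        CMSampleBufferRef sampleBuffer = self.lastSampleBuffer;
        CVImageBufferRef pixelBuffer = sampleBuffer ? CMSampleBufferGetImageBuffer(sampleBuffer) : NULL;
        if (!pixelBuffer) {
            completion(nil);
            return;
        }

        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:ciImage.extent];
        UIImage *image = cgImage ? [UIImage imageWithCGImage:cgImage] : nil;
        if (cgImage) {
            CGImageRelease(cgImage); // createCGImage returns a +1 reference
        }
        completion(image);
    }

The intermediate objects here (the CIImage, the CGImage, the UIImage) are exactly what tends to accumulate when this runs once per frame, which is why both answers below reach for @autoreleasepool.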
2 Answers
  • 2021-01-15 06:48

    We were experiencing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send off a frame every couple of seconds. After running for a while we would end up with quite a few memory pressure messages.

    We managed to rectify this by running our processing code in its own autorelease pool, like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping; a sketch of such a helper follows the snippet):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        @autoreleasepool {
            if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {
                NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];

                if (imageData) {
                    [self processImageData:imageData];
                }

                self.lastFrameSentAt = [NSDate date];
                imageData = nil;
            }
        }
    }
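
    For reference, here is a minimal sketch of what a helper like jpegDataFromSampleBufferAndCrop could look like; the center-square crop, the reused ciContext property and the JPEG quality are assumptions, not the answerer's actual code:

    - (NSData *)jpegDataFromSampleBufferAndCrop:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (!pixelBuffer) {
            return nil;
        }

        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        // Hypothetical crop: keep the centered square of the frame.
        CGRect extent = ciImage.extent;
        CGFloat side = MIN(CGRectGetWidth(extent), CGRectGetHeight(extent));
        CGRect cropRect = CGRectMake(CGRectGetMidX(extent) - side / 2.0,
                                     CGRectGetMidY(extent) - side / 2.0,
                                     side, side);
        ciImage = [ciImage imageByCroppingToRect:cropRect];

        // ciContext is assumed to be created once and reused; building
        // a CIContext per frame is expensive.
        CGImageRef cgImage = [self.ciContext createCGImage:ciImage fromRect:ciImage.extent];
        if (!cgImage) {
            return nil;
        }

        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage); // createCGImage returns a +1 reference
        return UIImageJPEGRepresentation(image, 0.8);
    }

    Because the helper runs inside the @autoreleasepool above, the intermediate CIImage, UIImage and NSData objects are drained after each processed frame instead of accumulating on the capture queue.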
    
  • 2021-01-15 06:59

    I can confirm that this memory leak still exists on iOS 9.2. (I've also posted on the Apple Developer Forums.)

    I've tested dropping the EAGLContext in favor of MetalKit and MTLDevice, and I've tried the different CIContext rendering methods (drawImage, createCGImage and render), but nothing seems to work.

    It is very clear that this is a bug as of iOS 9. Try it yourself: download Apple's AVBasicVideoOutput sample app (link below), run the same project on a device with iOS 8.4 and then on a device with iOS 9.2, and watch the memory gauge in Xcode.

    Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109

    Add this at APLEAGLView.h:20:

    @property (strong, nonatomic) CIContext* ciContext;
    

    Replace APLEAGLView.m:118 with this:

    [EAGLContext setCurrentContext:_context];
    _ciContext = [CIContext contextWithEAGLContext:_context];
    

    And finally, replace APLEAGLView.m:341-343 with this:

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    // Extra Core Image pass used to reproduce the leak; it runs in its
    // own autorelease pool, yet memory still grows on iOS 9.
    @autoreleasepool
    {
        CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"
                                      keysAndValues:kCIInputImageKey, sourceImage, nil];
        CIImage *filteredImage = filter.outputImage;

        [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];
    }

    glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);
    