AVCapture capturing and getting framebuffer at 60 fps in iOS 7


I'm developing an app which requires capturing the framebuffer at as many fps as possible. I've already figured out how to force the iPhone to capture at 60 fps, but



        
3 Answers
  • 2020-12-23 02:53

    I have implemented the same function in Swift 2.0; I'm posting the code here for anyone who might need it:

    // Set your desired frame rate
    func setupCamera(maxFpsDesired: Double = 120) {
        let captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSessionPreset1920x1080
        let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            captureSession.addInput(input)
        }
        catch {
            print("Error: can't access camera")
            return
        }
        do {
            var finalFormat = AVCaptureDeviceFormat()
            var maxFps: Double = 0
            for vFormat in backCamera!.formats {
                var ranges      = vFormat.videoSupportedFrameRateRanges as!  [AVFrameRateRange]
                let frameRates  = ranges[0]
                /*
                     "frameRates.maxFrameRate >= maxFps" select the video format
                     desired with the highest resolution available, because
                     the camera formats are ordered; else
                     "frameRates.maxFrameRate > maxFps" select the first
                     format available with the desired fps 
                */
                if frameRates.maxFrameRate >= maxFps && frameRates.maxFrameRate <= maxFpsDesired {
                    maxFps = frameRates.maxFrameRate
                    finalFormat = vFormat as! AVCaptureDeviceFormat
                }
            }
            if maxFps != 0 {
                let timeValue = Int64(1200.0 / maxFps)
                let timeScale: Int32 = 1200
                try backCamera!.lockForConfiguration()
                backCamera!.activeFormat = finalFormat
                backCamera!.activeVideoMinFrameDuration = CMTimeMake(timeValue, timeScale)
                backCamera!.activeVideoMaxFrameDuration = CMTimeMake(timeValue, timeScale)
                backCamera!.focusMode = AVCaptureFocusMode.AutoFocus
                backCamera!.unlockForConfiguration()
            }
        }
        catch {
             print("Something was wrong")
        }
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.alwaysDiscardsLateVideoFrames = true
        videoOutput.videoSettings = NSDictionary(object: Int(kCVPixelFormatType_32BGRA),
            forKey: kCVPixelBufferPixelFormatTypeKey as String) as [NSObject : AnyObject]
        videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
        if captureSession.canAddOutput(videoOutput) {
            captureSession.addOutput(videoOutput)
        }
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.transform = CATransform3DMakeRotation(-1.5708, 0, 0, 1)
        previewLayer.frame = self.view.bounds
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        self.view.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }
    
  • 2020-12-23 03:02

    Had the same problem. Fixed it by calling this function after [AVCaptureSession addInput:cameraDeviceInput]. For some reason I could not change the frame rate on my iPad Pro before the capture session was started, so I only switch the video format after the device has been added to the capture session.

    - (void)switchFormatWithDesiredFPS:(CGFloat)desiredFPS
    {
        BOOL isRunning = _captureSession.isRunning;
    
        if (isRunning)  [_captureSession stopRunning];
    
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceFormat *selectedFormat = nil;
        int32_t maxWidth = 0;
        AVFrameRateRange *frameRateRange = nil;
    
        for (AVCaptureDeviceFormat *format in [videoDevice formats]) {
    
            for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
    
                CMFormatDescriptionRef desc = format.formatDescription;
                CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(desc);
                int32_t width = dimensions.width;
    
                if (range.minFrameRate <= desiredFPS && desiredFPS <= range.maxFrameRate && width >= maxWidth) {
    
                    selectedFormat = format;
                    frameRateRange = range;
                    maxWidth = width;
                }
            }
        }
    
        if (selectedFormat) {
    
            if ([videoDevice lockForConfiguration:nil]) {
    
                NSLog(@"selected format:%@", selectedFormat);
                videoDevice.activeFormat = selectedFormat;
                videoDevice.activeVideoMinFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
                videoDevice.activeVideoMaxFrameDuration = CMTimeMake(1, (int32_t)desiredFPS);
                [videoDevice unlockForConfiguration];
            }
        }
    
        if (isRunning) [_captureSession startRunning];
    }
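
    For reference, the call order this relies on looks roughly like the sketch below; _captureSession and cameraDeviceInput are the names from the description above, assumed here for illustration:

    // Hypothetical setup: attach the camera input first, then switch the format.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *cameraDeviceInput =
        [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

    if (cameraDeviceInput && [_captureSession canAddInput:cameraDeviceInput]) {
        [_captureSession addInput:cameraDeviceInput];
    }

    // Changing the active format only sticks once the device is part of the session.
    [self switchFormatWithDesiredFPS:60.0];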
    
  • 2020-12-23 03:10

    I am getting samples at 60 fps on the iPhone 5 and 120 fps on the iPhone 5s, both when doing real-time motion detection in captureOutput and when saving the frames to a video using AVAssetWriter.

    You have to set the AVCaptureSession to a format that supports 60 fps:

    AVsession = [[AVCaptureSession alloc] init];
    
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *capInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (capInput) [AVsession addInput:capInput];
    
    for(AVCaptureDeviceFormat *vFormat in [videoDevice formats] ) 
    {
        CMFormatDescriptionRef description= vFormat.formatDescription;
        float maxrate=((AVFrameRateRange*)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
    
        if(maxrate>59 && CMFormatDescriptionGetMediaSubType(description)==kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
        {
            if ( YES == [videoDevice lockForConfiguration:NULL] ) 
            {
               videoDevice.activeFormat = vFormat;
               [videoDevice setActiveVideoMinFrameDuration:CMTimeMake(10,600)];
               [videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(10,600)];
               [videoDevice unlockForConfiguration];
               NSLog(@"formats  %@ %@ %@",vFormat.mediaType,vFormat.formatDescription,vFormat.videoSupportedFrameRateRanges);
            }
         }
    }
    
    prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: AVsession];
    prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer: prevLayer];
    
    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
    [videoOut setSampleBufferDelegate:self queue:videoQueue];
    
    videoOut.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    videoOut.alwaysDiscardsLateVideoFrames=YES;
    
    if (videoOut)
    {
        [AVsession addOutput:videoOut];
        videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
    }
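
    With the output added, each frame is delivered on videoQueue through the sample buffer delegate. A minimal sketch of pulling the pixel data out of it (this callback is assumed, not part of the original answer; your per-frame work goes where indicated):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // The BGRA "framebuffer" for this frame.
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
        void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        // ... real-time processing on baseAddress / bytesPerRow here ...
        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    }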
    

    Two other comments if you want to write to a file using AVAssetWriter. Don't use the pixel buffer adaptor; just add the samples with

    [videoWriterInput appendSampleBuffer:sampleBuffer]
    

    Secondly, when setting up the asset writer, use

    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings 
                                     sourceFormatHint:formatDescription];
    

    The sourceFormatHint makes a difference in writing speed.
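
    Putting those two notes together, a rough sketch of the writer-input setup (videoWriter, the H.264 settings, and the 1280x720 dimensions are assumptions for illustration; formatDescription would come from the chosen device format or from the first sample buffer):

    // Assumed: videoWriter is an AVAssetWriter you have already created.
    CMFormatDescriptionRef formatDescription = videoDevice.activeFormat.formatDescription;
    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @1280,
                                     AVVideoHeightKey : @720 };

    AVAssetWriterInput *videoWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings
                                         sourceFormatHint:formatDescription];
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    // In the sample buffer callback, once writing has started:
    if (videoWriterInput.isReadyForMoreMediaData) {
        [videoWriterInput appendSampleBuffer:sampleBuffer];
    }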
