AVCapture appendSampleBuffer

Asked by 一向 on 2021-02-01 10:46

I am going insane with this one - I have looked everywhere and tried anything and everything I can think of.

Am making an iPhone app that uses AVFoundation - specifically

2 Answers
  • 2021-02-01 11:26

    You need AVAssetWriterInputPixelBufferAdaptor; here is the code to create it:

        // Create dictionary for pixel buffer adaptor
        NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil];

        // Create pixel buffer adaptor
        m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:assetWriterInput sourcePixelBufferAttributes:bufferAttributes];

    And the code to use it:

        // If the input is ready to accept more media data
        if (m_pixelsBufferAdaptor.assetWriterInput.readyForMoreMediaData) {
            // Create a pixel buffer from the adaptor's pool (the pool is only
            // available once the asset writer's session has started)
            CVPixelBufferRef pixelsBuffer = NULL;
            CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, m_pixelsBufferAdaptor.pixelBufferPool, &pixelsBuffer);
            if (status != kCVReturnSuccess || pixelsBuffer == NULL) {
                return;
            }

            // Lock the pixel buffer's base address before writing to it
            CVPixelBufferLockBaseAddress(pixelsBuffer, 0);

            // Write your pixel data into the buffer (in your case, fill it with your finalImage data)
            [self yourFunctionToPutDataInPixelBuffer:CVPixelBufferGetBaseAddress(pixelsBuffer)];

            // Unlock the pixel buffer's base address
            CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0);

            // Append the pixel buffer. Compute currentFrameTime however you need;
            // the simplest approach is to start at zero and advance by one frame
            // duration (the inverse of your frame rate) each time you write a frame.
            [m_pixelsBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:currentFrameTime];

            // Release the pixel buffer
            CVPixelBufferRelease(pixelsBuffer);
        }


    And don't forget to release your pixelsBufferAdaptor.
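    The currentFrameTime bookkeeping mentioned above can be sanity-checked with plain arithmetic: CoreMedia's CMTime is a rational value/timescale pair, so frame N of a constant-rate stream gets the exact timestamp CMTimeMake(N, fps). Here is a minimal stand-alone sketch of that arithmetic - note that FrameTime and frame_time are hypothetical stand-ins for illustration, not CoreMedia API:

    ```c
    #include <stdio.h>

    /* Hypothetical stand-in for CoreMedia's CMTime rational timestamp:
       represents value/timescale seconds, like CMTimeMake(value, timescale). */
    typedef struct { long long value; int timescale; } FrameTime;

    static FrameTime frame_time(long long frameIndex, int fps) {
        /* Frame N of a constant-fps stream starts at N/fps seconds,
           which the rational pair (N, fps) represents exactly. */
        FrameTime t = { frameIndex, fps };
        return t;
    }

    int main(void) {
        /* At 30 fps, frame 45 lands at 45/30 = 1.5 seconds. */
        FrameTime t = frame_time(45, 30);
        printf("%lld/%d = %.2f s\n", t.value, t.timescale,
               (double)t.value / t.timescale);
        return 0;
    }
    ```

    Keeping the timestamp as an integer pair instead of accumulating a floating-point sum avoids rounding drift over long recordings, which is exactly why CMTime is rational.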

  • 2021-02-01 11:28

    I do it by using CMSampleBufferCreateForImageBuffer().

        // 'pixel' is the CVPixelBufferRef to wrap; 'pts' and 'duration'
        // are the CMTimes for this frame.
        OSStatus ret = 0;
        CMSampleBufferRef sample = NULL;
        CMVideoFormatDescriptionRef videoInfo = NULL;
        CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid;
        timingInfo.presentationTimeStamp = pts;
        timingInfo.duration = duration;

        ret = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixel, &videoInfo);
        if (ret != 0) {
            NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer failed! %d", (int)ret);
            goto done;
        }
        ret = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixel, true, NULL, NULL,
                                                 videoInfo, &timingInfo, &sample);
        if (ret != 0) {
            NSLog(@"CMSampleBufferCreateForImageBuffer failed! %d", (int)ret);
            goto done;
        }

        // Use 'sample' here (e.g. pass it to -[AVAssetWriterInput appendSampleBuffer:]).

    done:
        if (videoInfo) CFRelease(videoInfo);
        if (sample) CFRelease(sample);
