Making CIContext.render(CIImage, CVPixelBuffer) work with AVAssetWriter

南方客 2021-01-03 06:45

I want to use Core Image to process a bunch of CGImage objects and turn them into a QuickTime movie on macOS. The following code demonstrates the issue.

2 Answers
  •  -上瘾入骨i
     2021-01-03 07:32

    After speaking with Apple Developer Technical Support it appears that:

    Core Image defers the rendering until the client requests access to the frame buffer, i.e. via CVPixelBufferLockBaseAddress.

    So, the solution is simply to call CVPixelBufferLockBaseAddress after CIContext.render, as shown below:

    for frameNumber in 0 ..< frameCount {
        // Obtain a fresh pixel buffer from the adaptor's pool.
        var pixelBuffer: CVPixelBuffer?
        guard let pixelBufferPool: CVPixelBufferPool = pixelBufferAdaptor.pixelBufferPool else { preconditionFailure() }
        precondition(CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &pixelBuffer) == kCVReturnSuccess)
    
        // Render the frame. Core Image defers the actual rendering work here.
        let ciImage = CIImage(cgImage: frameImage)
        context.render(ciImage, to: pixelBuffer!)
    
        // Locking the base address forces Core Image to flush its deferred
        // rendering into the pixel buffer before the buffer is appended.
        precondition(CVPixelBufferLockBaseAddress(pixelBuffer!, []) == kCVReturnSuccess)
        defer { precondition(CVPixelBufferUnlockBaseAddress(pixelBuffer!, []) == kCVReturnSuccess) }
    
        // Sanity check: the buffer is no longer all zeroes.
        let bytes = UnsafeBufferPointer(start: CVPixelBufferGetBaseAddress(pixelBuffer!)!.assumingMemoryBound(to: UInt8.self), count: CVPixelBufferGetDataSize(pixelBuffer!))
        precondition(bytes.contains(where: { $0 != 0 }))
    
        // Wait until the writer input can accept more data, then append.
        // Note that `frameRate` is used here as the duration of one frame in seconds.
        while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 10 / 1000) }
        precondition(pixelBufferAdaptor.append(pixelBuffer!, withPresentationTime: CMTime(seconds: Double(frameNumber) * frameRate, preferredTimescale: 600)))
    }
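    The loop assumes that an AVAssetWriter, its video input, the pixel-buffer adaptor, and a CIContext have already been set up. A minimal sketch of that surrounding setup might look like the following; names such as movieURL, width, and height are illustrative assumptions, not from the original post:

    ```swift
    import AVFoundation
    import CoreImage

    // Hypothetical output parameters; adjust to your movie.
    let movieURL = URL(fileURLWithPath: "/tmp/output.mov")
    let width = 1920, height = 1080

    let writer = try AVAssetWriter(outputURL: movieURL, fileType: .mov)

    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ])

    // The adaptor owns the pixel-buffer pool used in the render loop.
    let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
            kCVPixelBufferWidthKey as String: width,
            kCVPixelBufferHeightKey as String: height
        ])

    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    let context = CIContext()

    // ... render loop from the answer goes here ...

    // Finish up once all frames have been appended.
    input.markAsFinished()
    writer.finishWriting { /* handle completion */ }
    ```

    Using the adaptor's own pool (rather than allocating standalone pixel buffers) keeps the buffer format consistent with what the writer expects.
    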
    
