How can we get an H.264-encoded video stream from the iPhone camera?

梦如初夏 · 2021-02-06 10:03

I am using the following to get the video sample buffer:

- (void) writeSampleBufferStream:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType

2 Answers
  •  感情败类
    2021-02-06 10:27

    You can only get raw video frames, in either BGRA or YUV pixel formats, from AVFoundation. However, when you write those frames to an mp4 via AVAssetWriter, they will be encoded as H264.
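    As a minimal sketch of that writer setup (the output URL, the 640x480 dimensions, and the sampleBuffer variable are my assumptions, not details from the question):

        #import <AVFoundation/AVFoundation.h>

        // outputURL and sampleBuffer are assumed to exist in your capture code.
        NSError *error = nil;
        AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                         fileType:AVFileTypeMPEG4
                                                            error:&error];
        // Ask for H.264; the writer encodes the raw frames as it writes them.
        NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                    AVVideoWidthKey  : @640,
                                    AVVideoHeightKey : @480 };
        AVAssetWriterInput *videoInput =
            [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                               outputSettings:settings];
        videoInput.expectsMediaDataInRealTime = YES;
        [writer addInput:videoInput];
        [writer startWriting];
        [writer startSessionAtSourceTime:kCMTimeZero]; // or the first frame's PTS

        // In the capture callback, hand each raw BGRA/YUV frame to the writer:
        if (videoInput.readyForMoreMediaData) {
            [videoInput appendSampleBuffer:sampleBuffer];
        }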

    A good example with code on how to do that is Apple's RosyWriter sample.

    Note that after each AVAssetWriter write, you know that one complete H264 NAL was written to the mp4. You could write code that reads that complete H264 NAL after each AVAssetWriter write, which gives you access to an H264-encoded frame. It might take a bit to get it right with decent speed, but it is doable (I did it successfully).
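    A rough sketch of that read-back step. It assumes the usual 4-byte NAL length prefixes that mp4 files use, plus an offset variable tracking where the previous read ended and an mp4Path variable; all of these are my assumptions, not from the original code:

        #import <Foundation/Foundation.h>

        NSFileHandle *fh = [NSFileHandle fileHandleForReadingAtPath:mp4Path];
        [fh seekToFileOffset:offset];

        // mp4 stores each NAL with a big-endian length prefix instead of
        // an Annex B start code; read the length, then the NAL itself.
        NSData *lengthField = [fh readDataOfLength:4];
        if (lengthField.length == 4) {
            uint32_t nalLength =
                CFSwapInt32BigToHost(*(const uint32_t *)lengthField.bytes);
            NSData *nal = [fh readDataOfLength:nalLength]; // one encoded NAL
            offset += 4 + nalLength;

            // For a raw H264 stream, swap the length prefix for a start code.
            NSMutableData *annexB =
                [NSMutableData dataWithBytes:"\x00\x00\x00\x01" length:4];
            [annexB appendData:nal];
        }
        [fh closeFile];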

    By the way, in order to successfully decode these encoded video frames, you will need the H264 SPS and PPS information, which lives in a different place in the mp4 file. In my case, I created a couple of test mp4 files and manually extracted the SPS/PPS out of them. Since those don't change unless you change the H264 encoding settings, you can hard-code them in your code.
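    A minimal sketch of that extract-once, hard-code approach; the byte values below are placeholders and must come from your own test mp4:

        #import <Foundation/Foundation.h>

        // Placeholder values: copy the real bytes out of your test mp4.
        // 0x67/0x68 are the standard NAL header bytes for SPS/PPS.
        static const uint8_t kSPS[] = { 0x67 /* ...rest from your file... */ };
        static const uint8_t kPPS[] = { 0x68 /* ...rest from your file... */ };

        // Send both parameter sets (with Annex B start codes) ahead of the
        // first encoded frame so the remote decoder can configure itself.
        NSMutableData *header = [NSMutableData data];
        [header appendBytes:"\x00\x00\x00\x01" length:4];
        [header appendBytes:kSPS length:sizeof(kSPS)];
        [header appendBytes:"\x00\x00\x00\x01" length:4];
        [header appendBytes:kPPS length:sizeof(kPPS)];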

    Check my post SPS values for H 264 stream in iPhone to see some of the SPS/PPS values I used in my code.

    Just a final note: in my case I had to stream the H264-encoded frames to another endpoint for decoding/viewing, so my code had to do this fast. It was relatively fast, but I eventually switched to VP8 for encoding/decoding, simply because it was much faster: everything was done in memory, without any file reading/writing.

    Good luck, and hopefully this info helps.
