How to decode an H.264 frame on iOS by hardware decoding?


Question


I have been using ffmpeg to decode every single frame that I receive from my IP camera. The brief code looks like this:

-(void) decodeFrame:(unsigned char *)frameData frameSize:(int)frameSize{
   AVFrame frame;
   AVPicture picture;
   AVPacket pkt;
   int got_picture = 0;
   // context is assumed to be an AVCodecContext * that was opened with
   // avcodec_open2() during setup (omitted here for brevity)
   av_init_packet(&pkt);
   pkt.data = frameData;
   pkt.size = frameSize;
   avcodec_get_frame_defaults(&frame);
   avpicture_alloc(&picture, PIX_FMT_RGB24, targetWidth, targetHeight);
   avcodec_decode_video2(context, &frame, &got_picture, &pkt);
}

The code works fine, but it's software decoding. I want to improve the decoding performance with hardware decoding. After lots of research, I know it may be achievable with the AVFoundation framework. The AVAssetReader class may help, but I can't figure out what the next step is. Could anyone point out the following steps for me? Any help would be appreciated.


Answer 1:


iOS does not provide any direct public access to the hardware decode engine, because hardware is always used to decode H.264 video on iOS.

Therefore, WWDC 2014 session 513 gives you all the information you need to do frame-by-frame decoding on iOS. In short, per that session:

  • Generate individual network abstraction layer units (NALUs) from your H.264 elementary stream. There is much information on how this is done online. VCL NALUs (IDR and non-IDR) contain your video data and are to be fed into the decoder.
  • Re-package those NALUs according to the "AVCC" format, removing NALU start codes and replacing them with a 4-byte NALU length header.
  • Create a CMVideoFormatDescriptionRef from your SPS and PPS NALUs via CMVideoFormatDescriptionCreateFromH264ParameterSets().
  • Package NALU frames as CMSampleBuffers per session 513.
  • Create a VTDecompressionSessionRef and pass the sample buffers to VTDecompressionSessionDecodeFrame() (see the sketch after this list).
    • Alternatively, use AVSampleBufferDisplayLayer, whose -enqueueSampleBuffer: method obviates the need to create your own decoder.
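
A rough, illustrative sketch of those steps in the same C-style Objective-C as the question; the helper names createFormatDescription, createDecoder, decodeNALU and the callback decodeFrameCallback are made up, not Apple API, error handling is omitted, and the SPS, PPS and VCL NALUs are assumed to have already been extracted with their start codes stripped:

#import <VideoToolbox/VideoToolbox.h>
#include <stdlib.h>
#include <string.h>

// Called by VideoToolbox for every decoded frame (possibly on a background thread).
static void decodeFrameCallback(void *decompressionOutputRefCon, void *sourceFrameRefCon,
                                OSStatus status, VTDecodeInfoFlags infoFlags,
                                CVImageBufferRef imageBuffer,
                                CMTime presentationTimeStamp, CMTime presentationDuration) {
    if (status == noErr && imageBuffer != NULL) {
        // Use or display the decoded CVPixelBuffer here.
    }
}

// Build a format description from the SPS and PPS NALUs (start codes already removed).
static CMVideoFormatDescriptionRef createFormatDescription(const uint8_t *sps, size_t spsSize,
                                                           const uint8_t *pps, size_t ppsSize) {
    CMVideoFormatDescriptionRef formatDesc = NULL;
    const uint8_t *const parameterSets[2] = { sps, pps };
    const size_t parameterSetSizes[2] = { spsSize, ppsSize };
    CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault,
                                                        2, parameterSets, parameterSetSizes,
                                                        4,   // AVCC 4-byte NALU length headers
                                                        &formatDesc);
    return formatDesc;
}

// Create the decompression session with the callback above.
static VTDecompressionSessionRef createDecoder(CMVideoFormatDescriptionRef formatDesc) {
    VTDecompressionSessionRef session = NULL;
    VTDecompressionOutputCallbackRecord callback = { decodeFrameCallback, NULL };
    VTDecompressionSessionCreate(kCFAllocatorDefault, formatDesc,
                                 NULL,          // default decoder selection
                                 NULL,          // default pixel buffer attributes
                                 &callback, &session);
    return session;
}

// Wrap one VCL NALU (start code stripped) in AVCC framing, package it as a
// CMSampleBuffer and hand it to the decompression session.
static void decodeNALU(VTDecompressionSessionRef session,
                       CMVideoFormatDescriptionRef formatDesc,
                       const uint8_t *nalu, size_t naluSize) {
    // AVCC framing: a 4-byte big-endian length replaces the start code.
    size_t avccSize = naluSize + 4;
    uint8_t *avcc = malloc(avccSize);
    uint32_t bigEndianLength = CFSwapInt32HostToBig((uint32_t)naluSize);
    memcpy(avcc, &bigEndianLength, 4);
    memcpy(avcc + 4, nalu, naluSize);

    CMBlockBufferRef blockBuffer = NULL;
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, avcc, avccSize,
                                       kCFAllocatorMalloc,  // frees avcc when the buffer is released
                                       NULL, 0, avccSize, 0, &blockBuffer);

    CMSampleBufferRef sampleBuffer = NULL;
    const size_t sampleSizes[1] = { avccSize };
    CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, true, NULL, NULL,
                         formatDesc, 1, 0, NULL, 1, sampleSizes, &sampleBuffer);

    VTDecompressionSessionDecodeFrame(session, sampleBuffer, 0, NULL, NULL);

    CFRelease(sampleBuffer);
    CFRelease(blockBuffer);
}

A real decoder would also check every OSStatus, rebuild the format description and session whenever a new SPS/PPS pair arrives, and start decoding from an IDR NALU. These VideoToolbox APIs are available starting with iOS 8, as the second answer notes.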



Answer 2:


Edit:

This link provides a more detailed, step-by-step explanation of how to decode H.264: stackoverflow.com/a/29525001/3156169

Original answer:

I watched session 513, "Direct Access to Video Encoding and Decoding", from WWDC 2014 yesterday, and found the answer to my own question.

The speaker says:

We have Video Toolbox (in iOS 8). Video Toolbox has been there on OS X for a while, but now it's finally populated with headers on iOS. This provides direct access to encoders and decoders.

So, there is no way to do hardware decoding frame by frame in iOS 7, but it can be done in iOS 8.

Has anyone figured out how to directly access video encoding and decoding frame by frame in iOS 8?
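
For what it's worth, the AVSampleBufferDisplayLayer route mentioned in the first answer is the shortest path on iOS 8: the layer decodes and displays CMSampleBuffers itself, so no VTDecompressionSessionRef is needed. A minimal, illustrative sketch, assuming a view controller and a sampleBuffer built from an AVCC-framed NALU as described in the first answer (the method names makeVideoLayer and displaySampleBuffer:onLayer: are made up):

#import <AVFoundation/AVFoundation.h>

// One-time setup inside a view controller.
- (AVSampleBufferDisplayLayer *)makeVideoLayer {
    AVSampleBufferDisplayLayer *videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    videoLayer.frame = self.view.bounds;
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer:videoLayer];
    return videoLayer;
}

// Per access unit: enqueue the sample buffer; the layer decodes and renders it.
- (void)displaySampleBuffer:(CMSampleBufferRef)sampleBuffer
                    onLayer:(AVSampleBufferDisplayLayer *)videoLayer {
    if (videoLayer.isReadyForMoreMediaData) {
        [videoLayer enqueueSampleBuffer:sampleBuffer];
    }
}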



Source: https://stackoverflow.com/questions/25197169/how-to-decode-a-h-264-frame-on-ios-by-hardware-decoding
