h.264

H.264 codec explained [closed]

Submitted by 旧时模样 on 2019-12-20 10:03:13
Question: I am making an app that supports video calls, and I am looking for a tutorial/doc explaining the structure of the H.264 codec. I want to be able to package the stream, wrap it in datagrams, send it, and unpack it on the receiving side. Any suggestions/reading materials? Answer 1: What do you mean by structure? If you are
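A common starting point for packaging an H.264 stream into datagrams is to split the Annex B byte stream on its 00 00 01 start codes, so that each NAL unit can be sent (or further fragmented) on its own. Below is a minimal sketch in plain C, assuming the whole stream is already in a buffer; the helper names are illustrative, not from the question.

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Return the offset of the next 00 00 01 start code at or after `pos`,
 * or `len` if none is found. A 4-byte 00 00 00 01 start code is matched
 * too, because the scan lands on its last three bytes. */
static size_t next_start_code(const uint8_t *buf, size_t len, size_t pos)
{
    for (size_t i = pos; i + 3 <= len; i++)
        if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1)
            return i;
    return len;
}

void split_nal_units(const uint8_t *buf, size_t len)
{
    size_t start = next_start_code(buf, len, 0);
    while (start < len) {
        size_t payload = start + 3;                 /* skip 00 00 01 */
        size_t end = next_start_code(buf, len, payload);
        size_t nal_len = end - payload;
        /* Drop a trailing zero that belongs to the next 4-byte start code. */
        if (end < len && nal_len > 0 && buf[end - 1] == 0)
            nal_len--;
        int nal_type = buf[payload] & 0x1F;         /* 7 = SPS, 8 = PPS, 5 = IDR */
        printf("NAL type %d, %zu bytes\n", nal_type, nal_len);
        /* Here: hand (buf + payload, nal_len) to the packetizer / socket. */
        start = end;
    }
}
```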

How to decode a H.264 frame on iOS by hardware decoding?

Submitted by 拜拜、爱过 on 2019-12-20 09:57:18
Question: I have been using ffmpeg to decode every single frame that I receive from my IP cam. The brief code looks like this: -(void) decodeFrame:(unsigned char *)frameData frameSize:(int)frameSize{ AVFrame frame; AVPicture picture; AVPacket pkt; AVCodecContext *context; pkt.data = frameData; pkt.size = frameSize; avcodec_get_frame_defaults(&frame); avpicture_alloc(&picture, PIX_FMT_RGB24, targetWidth, targetHeight); avcodec_decode_video2(context, &frame, &got_picture, &pkt); } The code works fine,
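For the hardware path the question asks about, iOS exposes the VideoToolbox C API instead of ffmpeg's software decoder. The sketch below shows the main calls under the assumption that SPS/PPS are available separately and that each frame has already been converted to AVCC form (4-byte length prefix instead of 00 00 01 start codes); the names around the VideoToolbox calls are placeholders and error handling is omitted.

```c
#include <VideoToolbox/VideoToolbox.h>
#include <stdbool.h>
#include <stdint.h>

/* Called by VideoToolbox for every decoded frame. */
static void onDecodedFrame(void *refCon, void *frameRefCon, OSStatus status,
                           VTDecodeInfoFlags flags, CVImageBufferRef imageBuffer,
                           CMTime pts, CMTime duration)
{
    if (status == noErr && imageBuffer) {
        /* imageBuffer is a CVPixelBuffer produced by the hardware decoder. */
    }
}

void decodeOneFrame(const uint8_t *sps, size_t spsLen,
                    const uint8_t *pps, size_t ppsLen,
                    uint8_t *naluAVCC, size_t naluLen)  /* 4-byte length prefix */
{
    /* 1. Build a format description from SPS/PPS. */
    const uint8_t *const paramSets[2] = { sps, pps };
    const size_t paramSizes[2] = { spsLen, ppsLen };
    CMVideoFormatDescriptionRef fmt = NULL;
    CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault, 2, paramSets, paramSizes, 4, &fmt);

    /* 2. Create the decompression session. */
    VTDecompressionOutputCallbackRecord cb = { onDecodedFrame, NULL };
    VTDecompressionSessionRef session = NULL;
    VTDecompressionSessionCreate(kCFAllocatorDefault, fmt, NULL, NULL, &cb, &session);

    /* 3. Wrap one AVCC NAL unit in a CMSampleBuffer and decode it. */
    CMBlockBufferRef block = NULL;
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, naluAVCC, naluLen,
                                       kCFAllocatorNull, NULL, 0, naluLen, 0, &block);
    CMSampleBufferRef sample = NULL;
    CMSampleBufferCreate(kCFAllocatorDefault, block, true, NULL, NULL, fmt,
                         1, 0, NULL, 0, NULL, &sample);
    VTDecompressionSessionDecodeFrame(session, sample, 0, NULL, NULL);

    CFRelease(sample);
    CFRelease(block);
    VTDecompressionSessionInvalidate(session);
    CFRelease(session);
    CFRelease(fmt);
}
```

In practice the session and format description would be created once and reused for every frame; they are created inline here only to keep the sketch self-contained.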

How do I feed H.264 NAL units to Android MediaCodec for decoding?

Submitted by 百般思念 on 2019-12-20 09:45:34
Question: I'm trying to figure out how to use Android's MediaCodec class to decode H.264 video. To start, I'm trying to manually parse the NAL units out of an H.264 file and feed them to MediaCodec for decoding. I believe I'm parsing the NAL units out of the file correctly (searching for the 0x00 0x00 0x01 sequence in the file, which indicates the start of a NAL unit), but MediaCodec always times out and returns -1 each time I call dequeueOutputBuffer(). Does anyone know the specifics of how to feed H.264
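The usual reasons dequeueOutputBuffer() keeps returning -1 are that SPS/PPS were never handed to the decoder as codec-specific data, and that the decoder needs several queued input buffers before any output appears at all. A hedged sketch using the NDK's AMediaCodec C API, the native counterpart of the Java MediaCodec class in the question; names such as createH264Decoder and feedNalUnit are illustrative.

```c
#include <media/NdkMediaCodec.h>
#include <media/NdkMediaFormat.h>
#include <android/native_window.h>
#include <stdbool.h>
#include <stdint.h>
#include <string.h>

AMediaCodec *createH264Decoder(const uint8_t *sps, size_t spsLen,
                               const uint8_t *pps, size_t ppsLen,
                               int width, int height, ANativeWindow *surface)
{
    AMediaFormat *fmt = AMediaFormat_new();
    AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, width);
    AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, height);
    /* Codec-specific data: SPS and PPS, each with its 00 00 00 01 start code. */
    AMediaFormat_setBuffer(fmt, "csd-0", sps, spsLen);
    AMediaFormat_setBuffer(fmt, "csd-1", pps, ppsLen);

    AMediaCodec *codec = AMediaCodec_createDecoderByType("video/avc");
    AMediaCodec_configure(codec, fmt, surface, NULL, 0);
    AMediaCodec_start(codec);
    AMediaFormat_delete(fmt);
    return codec;
}

/* Feed one NAL unit (start code included) and drain any ready output. */
void feedNalUnit(AMediaCodec *codec, const uint8_t *nal, size_t len, int64_t ptsUs)
{
    ssize_t idx = AMediaCodec_dequeueInputBuffer(codec, 10000 /* us */);
    if (idx < 0) return;                       /* no input buffer free yet */
    size_t cap = 0;
    uint8_t *buf = AMediaCodec_getInputBuffer(codec, idx, &cap);
    if (len > cap) return;                     /* NAL too large for this buffer */
    memcpy(buf, nal, len);
    AMediaCodec_queueInputBuffer(codec, idx, 0, len, ptsUs, 0);

    AMediaCodecBufferInfo info;
    ssize_t out = AMediaCodec_dequeueOutputBuffer(codec, &info, 0);
    if (out >= 0)                              /* -1 just means "try again later" */
        AMediaCodec_releaseOutputBuffer(codec, out, true /* render to surface */);
}
```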

Parsing H264 in mdat MP4

Submitted by ☆樱花仙子☆ on 2019-12-20 09:31:57
Question: I have a file that only contains the mdat atom of an MP4 container. The data in the mdat contains AVC data. I know the encoding parameters for the data. The format does not appear to be the Annex B byte stream format. I am wondering how I would go about parsing this. I have tried searching for the slice header, but have not had much luck. Is it possible to parse the slices without the NALs? Answer 1: AVC NAL units are in the following format in the MDAT section: [4 bytes] = NAL length, network byte order
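Following the answer's description of the mdat layout, parsing reduces to reading a 4-byte big-endian length and then that many bytes per NAL unit. A small sketch in plain C (the function name is illustrative):

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

void parse_mdat_avcc(const uint8_t *mdat, size_t len)
{
    size_t pos = 0;
    while (pos + 4 <= len) {
        /* 4-byte NAL unit length, network (big-endian) byte order. */
        uint32_t nal_len = ((uint32_t)mdat[pos]     << 24) |
                           ((uint32_t)mdat[pos + 1] << 16) |
                           ((uint32_t)mdat[pos + 2] <<  8) |
                            (uint32_t)mdat[pos + 3];
        pos += 4;
        if (nal_len == 0 || pos + nal_len > len)
            break;                              /* corrupt or truncated data */
        int nal_type = mdat[pos] & 0x1F;        /* 1 = non-IDR, 5 = IDR, 7 = SPS, 8 = PPS */
        printf("NAL type %d, %u bytes\n", nal_type, nal_len);
        /* To rebuild Annex B for a decoder: write 00 00 00 01, then the unit. */
        pos += nal_len;
    }
}
```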

H.264 over RTP - Identify SPS and PPS Frames

Submitted by 瘦欲@ on 2019-12-20 08:38:31
Question: I have a raw H.264 stream from an IP camera packed in RTP frames. I want to get the raw H.264 data into a file so I can convert it with ffmpeg. When I went to write the data into my raw H.264 file, I found out it has to look like this: 00 00 01 [SPS] 00 00 01 [PPS] 00 00 01 [NAL byte] [PAYLOAD RTP Frame 1] // payload always without the first 2 bytes -> NAL [PAYLOAD RTP Frame 2] [... until a PAYLOAD frame with the mark bit is received] // from here it's a new video frame 00 00 01 [NAL BYTE] [PAYLOAD RTP
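A sketch of that receiving side in plain C, following RFC 6184: single NAL unit packets (payload NAL types 1-23, including SPS = 7 and PPS = 8) are written with a start code, while FU-A fragments (type 28) drop their two-byte FU prefix and get the start code plus a reconstructed NAL header only on the first fragment. The function name and file handling are illustrative.

```c
#include <stdint.h>
#include <stdio.h>

static const uint8_t start_code[4] = { 0, 0, 0, 1 };

/* p/len: one RTP payload (RTP header already stripped). */
void handle_rtp_payload(FILE *out, const uint8_t *p, size_t len)
{
    if (len < 2) return;
    int type = p[0] & 0x1F;

    if (type >= 1 && type <= 23) {                /* single NAL unit packet */
        if (type == 7) puts("got SPS");
        if (type == 8) puts("got PPS");
        fwrite(start_code, 1, 4, out);
        fwrite(p, 1, len, out);
    } else if (type == 28) {                      /* FU-A fragment */
        int start = p[1] & 0x80;                  /* S bit of the FU header */
        if (start) {
            /* Rebuild the original NAL header from FU indicator + FU header. */
            uint8_t nal_hdr = (p[0] & 0xE0) | (p[1] & 0x1F);
            fwrite(start_code, 1, 4, out);
            fwrite(&nal_hdr, 1, 1, out);
        }
        fwrite(p + 2, 1, len - 2, out);           /* drop FU indicator + header */
    }
    /* STAP-A (24) and other aggregation packet types are left out of this sketch. */
}
```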

MFT Encoder (h264) High CPU utilization

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-20 02:37:05
Question: I am able to successfully encode data as H.264 using a Media Foundation Transform (MFT), but unfortunately I get very high CPU usage (when I comment out the call to this function, CPU usage is low). The encoding takes only a few steps, so is there anything I can do to improve it? Any idea would help. HRESULT MFTransform::EncodeSample(IMFSample *videosample, LONGLONG llVideoTimeStamp, MFT_OUTPUT_STREAM_INFO &StreamInfo, MFT_OUTPUT_DATA_BUFFER &encDataBuffer) { HRESULT hr; LONGLONG
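One common cause of this kind of CPU load is that the software H.264 encoder MFT was instantiated rather than a hardware one. The following hedged sketch (C with C-style COM via COBJMACROS) enumerates hardware H.264 encoder MFTs with MFTEnumEx so a GPU-backed transform can be activated instead; it is offered as something to check, not as the questioner's code.

```c
#define COBJMACROS
#include <windows.h>
#include <mfapi.h>
#include <mfidl.h>
#include <mftransform.h>
#include <stdio.h>

int main(void)
{
    if (FAILED(MFStartup(MF_VERSION, MFSTARTUP_FULL)))
        return 1;

    MFT_REGISTER_TYPE_INFO outType = { MFMediaType_Video, MFVideoFormat_H264 };
    IMFActivate **activates = NULL;
    UINT32 count = 0;

    /* Ask only for hardware encoders able to output H.264. */
    HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER,
                           MFT_ENUM_FLAG_HARDWARE | MFT_ENUM_FLAG_SORTANDFILTER,
                           NULL,            /* any input type */
                           &outType,
                           &activates, &count);

    if (SUCCEEDED(hr) && count > 0) {
        IMFTransform *encoder = NULL;
        /* Activate the first (best-ranked) hardware encoder. */
        IMFActivate_ActivateObject(activates[0], &IID_IMFTransform, (void **)&encoder);
        printf("hardware H.264 encoder MFTs found: %u\n", count);
        if (encoder) IMFTransform_Release(encoder);
    } else {
        printf("no hardware encoder found; the software MFT will burn CPU\n");
    }

    for (UINT32 i = 0; i < count; i++)
        IMFActivate_Release(activates[i]);
    CoTaskMemFree(activates);
    MFShutdown();
    return 0;
}
```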

Send Android h264 capture over a rtp stream

Submitted by 我怕爱的太早我们不能终老 on 2019-12-19 09:24:01
Question: I'm writing an RTP video streamer for Android that reads H.264-coded data from an Android local socket and packetizes it. The thing is that I did it, but I keep getting black frames on the client side (VoIP). The communication goes like this: Android -> Asterisk -> Jitsi (OS X) (and reverse). There are a few things that I haven't understood yet: 1) Android's MediaRecorder gives me a raw H.264 stream. How can I know when a NAL starts/ends based on that stream? It doesn't have any 0x000001 pattern
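On the missing 0x000001 pattern: MediaRecorder's MP4/3GPP output stores AVC in length-prefixed form rather than with Annex B start codes, so NAL boundaries come from the length fields, not from a start-code scan. For the packetizing side, the RFC 6184 rule of thumb is that a NAL unit that fits under the MTU goes out as one RTP payload, and anything larger is split into FU-A fragments whose FU indicator/header are rebuilt from the original NAL header. A sketch in plain C; send_rtp is a hypothetical callback that prepends the 12-byte RTP header and sends the datagram.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define MAX_PAYLOAD 1400   /* keep packets under a typical MTU */

typedef void (*send_rtp_fn)(const uint8_t *payload, size_t len, int marker);

/* nal/len: one NAL unit without a start code or length prefix. */
void packetize_nal(const uint8_t *nal, size_t len, send_rtp_fn send_rtp)
{
    if (len <= MAX_PAYLOAD) {                      /* single NAL unit packet */
        send_rtp(nal, len, 1);                     /* marker assumes this NAL ends the frame */
        return;
    }

    uint8_t fu_indicator = (nal[0] & 0xE0) | 28;   /* F + NRI bits, type 28 = FU-A */
    uint8_t fu_header = nal[0] & 0x1F;             /* original NAL unit type */
    const uint8_t *p = nal + 1;                    /* NAL header is carried in the FU header */
    size_t left = len - 1;
    int first = 1;
    uint8_t buf[2 + MAX_PAYLOAD];

    while (left > 0) {
        size_t chunk = left > MAX_PAYLOAD ? MAX_PAYLOAD : left;
        int last = (chunk == left);
        buf[0] = fu_indicator;
        buf[1] = fu_header | (first ? 0x80 : 0) | (last ? 0x40 : 0);  /* S / E bits */
        memcpy(buf + 2, p, chunk);
        send_rtp(buf, 2 + chunk, last);            /* marker only on the final fragment */
        p += chunk;
        left -= chunk;
        first = 0;
    }
}
```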

H.264 video encoder in javascript

Submitted by China☆狼群 on 2019-12-19 06:02:20
Question: I am looking to make a video encoder entirely in JavaScript. The idea is that the user will be able to specify an existing video (easy enough) or a range of images and then encode it to H.264 for publishing. I understand that encoding content is not supported right now, but I was wondering if this is something that is possible entirely in JavaScript (or with a Flash bridge)? Thanks. Answer 1: It is possible to compile a video encoder to JavaScript using emscripten. For example, here is
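To make the emscripten route concrete: the native code one would compile down to JavaScript is essentially an x264 encode loop like the sketch below (plain C; the x264 calls are the library's real API, but the surrounding frame handling is omitted and the function names are illustrative).

```c
#include <stdint.h>
#include <x264.h>

x264_t *open_encoder(int width, int height, int fps)
{
    x264_param_t param;
    x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.i_width   = width;
    param.i_height  = height;
    param.i_fps_num = fps;
    param.i_fps_den = 1;
    param.i_csp     = X264_CSP_I420;
    x264_param_apply_profile(&param, "baseline");
    return x264_encoder_open(&param);
}

/* Encode one I420 picture; returns a pointer to the Annex B bitstream and
 * its size via *out_size (0 means the encoder is still buffering). */
uint8_t *encode_frame(x264_t *enc, x264_picture_t *pic_in, int *out_size)
{
    x264_nal_t *nals = NULL;
    int nal_count = 0;
    x264_picture_t pic_out;
    *out_size = x264_encoder_encode(enc, &nals, &nal_count, pic_in, &pic_out);
    /* x264 lays the NAL units out contiguously, so the first payload pointer
       covers the whole encoded frame. */
    return (*out_size > 0) ? nals[0].p_payload : NULL;
}
```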

Why am I getting “Unsupported format” errors, reading H.264 encoded rtsp streams with the Android MediaPlayer?

Submitted by 拈花ヽ惹草 on 2019-12-19 05:51:57
Question: I am trying to show H.264-encoded RTSP video on an Android device. The stream is coming from a Raspberry Pi, using vlc to encode /dev/video1, which is a "Pi NoIR Camera Board": vlc-wrapper -vvv v4l2:///dev/video1 --v4l2-width $WIDTH --v4l2-height $HEIGHT --v4l2-fps ${FPS}.0 --v4l2-chroma h264 --no-audio --no-osd --sout "#rtp{sdp=rtsp://:8000/pi.sdp}" :demux=h264 > /tmp/vlc-wrapper.log 2>&1 I am using very minimal Android code right now: final MediaPlayer mediaPlayer = new MediaPlayer();