H.264

H264 Video Streaming over RTMP on iOS

Submitted by 假装没事ソ on 2019-12-22 10:05:22
Question: With a bit of digging, I have found a library that extracts NAL units from an .mp4 file while it is being written. I'm attempting to packetize this data as FLV over RTMP using libavformat and libavcodec. I set up a video stream using:

    -(void)setupVideoStream {
        int ret = 0;
        videoCodec = avcodec_find_decoder(STREAM_VIDEO_CODEC);
        if (videoCodec == nil) {
            NSLog(@"Could not find encoder %i", STREAM_VIDEO_CODEC);
            return;
        }
        videoStream = avformat_new_stream(oc, videoCodec);
        videoCodecContext =
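For reference, a minimal sketch of the libavformat side, written in C++ against the FFmpeg C API. The URL and frame dimensions are placeholders, not taken from the question, and note that the FLV muxer expects H.264 in AVCC (length-prefixed) form with SPS/PPS in the stream's extradata, so Annex B NAL units typically need converting first:

    // Hedged sketch: minimal FLV-over-RTMP muxer setup with the FFmpeg C API.
    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    }

    int main() {
        const char* url = "rtmp://example.com/live/stream";  // placeholder URL
        avformat_network_init();

        AVFormatContext* oc = nullptr;
        if (avformat_alloc_output_context2(&oc, nullptr, "flv", url) < 0) return 1;

        AVStream* st = avformat_new_stream(oc, nullptr);
        if (!st) return 1;
        st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
        st->codecpar->codec_id   = AV_CODEC_ID_H264;
        st->codecpar->width      = 1280;  // must match the incoming NAL units
        st->codecpar->height     = 720;

        if (avio_open(&oc->pb, url, AVIO_FLAG_WRITE) < 0) return 1;
        if (avformat_write_header(oc, nullptr) < 0) return 1;

        // ... wrap each H.264 access unit in an AVPacket and send it with
        // av_interleaved_write_frame(oc, &pkt) ...

        av_write_trailer(oc);
        avio_closep(&oc->pb);
        avformat_free_context(oc);
        return 0;
    }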

Convert H.264 Annex B to MPEG-TS

Submitted by 柔情痞子 on 2019-12-22 08:19:11
Question: So... I have raw H.264 video data captured via RTSP in a local file, and I am attempting to play back the video in a JavaFX application. In order to do this, I need to use HTTP Live Streaming. I have successfully prototyped a JavaFX architecture that can play a video via HLS with a local server, using a local folder containing a .m3u8 (HLS index) file and a collection of .ts (MPEG-TS) files. The last piece for me is to replace the .ts files with .264 / .h264 files and, in the local server,
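The asker wants pure Java, but for comparison, here is a hedged C++ sketch of the same remux through the FFmpeg C API (file names are placeholders; a raw Annex B stream carries no timestamps, so a real implementation would also have to synthesize PTS/DTS before handing packets to the mpegts muxer). The command-line equivalent is ffmpeg -i capture.h264 -c copy -f mpegts segment0.ts:

    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    }

    int main() {
        const char* in_name  = "capture.h264";  // raw Annex B stream (placeholder)
        const char* out_name = "segment0.ts";

        AVFormatContext* in = nullptr;
        if (avformat_open_input(&in, in_name, nullptr, nullptr) < 0) return 1;
        if (avformat_find_stream_info(in, nullptr) < 0) return 1;

        AVFormatContext* out = nullptr;
        if (avformat_alloc_output_context2(&out, nullptr, "mpegts", out_name) < 0) return 1;
        AVStream* ost = avformat_new_stream(out, nullptr);
        if (!ost || avcodec_parameters_copy(ost->codecpar, in->streams[0]->codecpar) < 0) return 1;

        if (avio_open(&out->pb, out_name, AVIO_FLAG_WRITE) < 0) return 1;
        if (avformat_write_header(out, nullptr) < 0) return 1;

        AVPacket pkt;
        while (av_read_frame(in, &pkt) >= 0) {
            pkt.stream_index = 0;  // single-stream remux
            av_interleaved_write_frame(out, &pkt);
            av_packet_unref(&pkt);
        }
        av_write_trailer(out);
        avio_closep(&out->pb);
        avformat_free_context(out);
        avformat_close_input(&in);
        return 0;
    }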

Why did H.264, MPEG-4 HE AAC stop working on iPhone XS/Max?

Submitted by 假装没事ソ on 2019-12-22 07:01:12
Question: Issue regarding NEW hardware. I have been investigating like crazy and haven't found any hints as to why my H.264 encoded videos have stopped working on these new devices. Context: direct from the iOS device, the original is sent to S3; AWS Elastic Transcoder then encodes the original into a more compressed H.264 preset. As of yesterday, a coworker was reporting all videos being "black"; now, since deliveries on these devices are being fulfilled, I've gotten confirmation. Cannot reproduce this

Why does AVSampleBufferDisplayLayer fail with Operation Interrupted (-11847)?

Submitted by 非 Y 不嫁゛ on 2019-12-22 06:56:41
Question: I'm using an AVSampleBufferDisplayLayer to decode and display H.264 video streamed from a server. When my app goes into the background and then returns to the foreground, the decoding process gets corrupted and the AVSampleBufferDisplayLayer fails. The error I'm seeing is:

    H.264 decoding layer has failed: Error Domain=AVFoundationErrorDomain Code=-11847
    "Operation Interrupted" UserInfo=0x17426c500 {NSUnderlyingError=0x17805fe90
    "The operation couldn’t be completed. (OSStatus error -12084.)",

H.264 Frames Memory Leak With Some Decoders

Submitted by 不羁的心 on 2019-12-22 04:34:42
Question: I'm receiving an H.264 stream from a DVR using its SDK. There were memory leaks, and I thought it was the SDK causing them all. But when I recorded the stream and played the frames back one by one, reading from disk (without any third-party DLLs involved), I noticed that the problem is not the DLL but the stream itself. Strangely enough, the DivX H264 Decoder is the only codec which doesn't cause a memory leak, but when the stream runs for a long time, sometimes the DivX decoder crashes as well. I'd

SPS values for H.264 stream in iPhone

Submitted by  ̄綄美尐妖づ on 2019-12-22 00:33:35
Question: Can someone point me to documentation that will help me get correct SPS and PPS values for iPhone?

Answer 1: The question is a bit unclear... The Picture Parameter Set is described in the latest ITU-T release of the standard in chapter 7.3.2.2; the Sequence Parameter Set is described in chapter 7.3.2.1.

Answer 2: You can encode a single frame to a file and then extract the SPS and PPS from that file. I have an example that shows how to do exactly that at http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html

Answer 3:
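Complementing Answer 2: once you have an Annex B elementary stream (for example a single encoded frame dumped to a file), the SPS and PPS are simply the NAL units with nal_unit_type 7 and 8. A self-contained C++ scanner sketch, assuming 3- or 4-byte start codes:

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Print offset and size of each SPS (nal_unit_type 7) and PPS (type 8)
    // NAL unit found in an Annex B buffer.
    static void find_parameter_sets(const uint8_t* buf, size_t len) {
        size_t i = 0;
        while (i + 2 < len) {
            if (!(buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1)) { ++i; continue; }
            size_t start = i + 3;  // payload begins after the 00 00 01 start code
            size_t j = start;
            while (j + 2 < len && !(buf[j] == 0 && buf[j + 1] == 0 && buf[j + 2] == 1)) ++j;
            size_t end = (j + 2 < len) ? j : len;
            if (end > start && end < len && buf[end - 1] == 0) --end;  // 4-byte start code ahead
            if (end <= start) break;
            int type = buf[start] & 0x1F;  // low 5 bits of the NAL header
            if (type == 7) std::printf("SPS at %zu, %zu bytes\n", start, end - start);
            if (type == 8) std::printf("PPS at %zu, %zu bytes\n", start, end - start);
            i = end;
        }
    }

    int main(int argc, char** argv) {
        if (argc < 2) return 1;
        std::FILE* f = std::fopen(argv[1], "rb");
        if (!f) return 1;
        std::vector<uint8_t> data;
        uint8_t tmp[4096];
        size_t n;
        while ((n = std::fread(tmp, 1, sizeof tmp, f)) > 0) data.insert(data.end(), tmp, tmp + n);
        std::fclose(f);
        find_parameter_sets(data.data(), data.size());
        return 0;
    }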

MFCreateFMPEG4MediaSink does not generate MSE-compatible MP4

Submitted by 拟墨画扇 on 2019-12-21 22:00:38
Question: I'm attempting to stream an H.264 video feed to a web browser. Media Foundation is used for encoding a fragmented MPEG-4 stream (MFCreateFMPEG4MediaSink with MFTranscodeContainerType_FMPEG4, MF_LOW_LATENCY and MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS enabled). The stream is then connected to a web server through IMFByteStream. Streaming of the H.264 video works fine when it's being consumed by a <video src=".."/> tag. However, the resulting latency is ~2 sec, which is too much for the
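For reference, a minimal C++ sketch of creating the fragmented-MP4 sink the question describes. It writes to a local file via MFCreateFile where the question plugs in a custom IMFByteStream, and the frame size/rate attributes are placeholder values; error handling is pared down to the essentials:

    #include <mfapi.h>
    #include <mfidl.h>
    #pragma comment(lib, "mfplat.lib")
    #pragma comment(lib, "mf.lib")
    #pragma comment(lib, "mfuuid.lib")

    int main() {
        if (FAILED(MFStartup(MF_VERSION))) return 1;

        IMFByteStream* bs = nullptr;
        // Local file target here; the question streams through a custom IMFByteStream.
        if (FAILED(MFCreateFile(MF_ACCESSMODE_READWRITE, MF_OPENMODE_DELETE_IF_EXIST,
                                MF_FILEFLAGS_NONE, L"frag.mp4", &bs))) return 1;

        IMFMediaType* vt = nullptr;
        MFCreateMediaType(&vt);
        vt->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
        vt->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264);
        MFSetAttributeSize(vt, MF_MT_FRAME_SIZE, 1280, 720);  // placeholder
        MFSetAttributeRatio(vt, MF_MT_FRAME_RATE, 30, 1);     // placeholder

        IMFMediaSink* sink = nullptr;
        HRESULT hr = MFCreateFMPEG4MediaSink(bs, vt, nullptr, &sink);
        // ... obtain the stream sink, feed it IMFSamples, then call Shutdown() ...

        if (sink) sink->Release();
        vt->Release();
        bs->Release();
        MFShutdown();
        return SUCCEEDED(hr) ? 0 : 1;
    }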

WebRTC: What is RTPFragmentationHeader in encoder implementation?

Submitted by 笑着哭i on 2019-12-21 21:24:53
Question: I have modified h264_encoder_impl to use an NVIDIA GRID-based hardware encoder, by replacing the OpenH264-specific calls with NVIDIA API calls. The encoded stream can be written to a file successfully, but writing the _buffer and _size of encoded_image_ is not enough: the RTPFragmentationHeader also needs to be filled.

    // RtpFragmentize(EncodedImage* encoded_image,
    //                std::unique_ptr<uint8_t[]>* encoded_image_buffer,
    //                const VideoFrameBuffer& frame_buffer,
    //                SFrameBSInfo* info,
    //
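The missing piece is one fragmentation entry per NAL unit, so the RTP packetizer can split the encoded image on NAL boundaries. A self-contained C++ sketch follows; the struct below only mirrors the shape of WebRTC's RTPFragmentationHeader (real code would call VerifyAndAllocateFragmentationHeader and fill its parallel arrays instead):

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Illustrative stand-in for webrtc::RTPFragmentationHeader; field names
    // follow the old module_common_types.h, but this is not the real class.
    struct FragmentationHeader {
        std::vector<size_t> fragmentationOffset;  // start of each NAL payload
        std::vector<size_t> fragmentationLength;  // size of each NAL payload
    };

    // Walk an Annex B buffer and record one (offset, length) pair per NAL unit.
    FragmentationHeader Fragmentize(const uint8_t* buf, size_t len) {
        FragmentationHeader frag;
        size_t i = 0;
        while (i + 2 < len) {
            if (!(buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1)) { ++i; continue; }
            size_t start = i + 3;  // NAL payload begins after 00 00 01
            size_t j = start;
            while (j + 2 < len && !(buf[j] == 0 && buf[j + 1] == 0 && buf[j + 2] == 1)) ++j;
            size_t end = (j + 2 < len) ? j : len;
            if (end > start && end < len && buf[end - 1] == 0) --end;  // 4-byte start code ahead
            if (end <= start) break;
            frag.fragmentationOffset.push_back(start);
            frag.fragmentationLength.push_back(end - start);
            i = end;
        }
        return frag;
    }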

Manual encoding into MPEG-TS

Submitted by 不问归期 on 2019-12-21 21:14:34
Question: So... I am trying to take an H264 Annex B byte-stream video and encode it into MPEG-TS in pure Java. My goal is to create a minimal, valid, single-program MPEG-TS stream that includes no timing information (PCR, PTS, DTS). I am currently at the point where my generated file can be passed to ffmpeg (ffmpeg -i myVideo.ts) and ffmpeg reports:

    [NULL @ 0x7f8103022600] start time is not set in estimate_timings_from_pts
    Input #0, mpegts, from 'video.ts':
      Duration: N/A, bitrate: N
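That ffmpeg warning is expected when PCR/PTS/DTS are deliberately omitted. For the packet layout itself, here is a self-contained C++ sketch of building one 188-byte TS packet, stuffing short payloads through the adaptation field as the spec requires (a PAT on PID 0x0000, a PMT, and PES-wrapped video packets built the same way make up the minimal single-program stream):

    #include <array>
    #include <cstdint>
    #include <cstring>

    // Build one 188-byte MPEG-TS packet: 4-byte header, then payload; when the
    // payload is shorter than 184 bytes, pad with an adaptation field of 0xFF.
    std::array<uint8_t, 188> MakeTsPacket(uint16_t pid, uint8_t cc, bool pusi,
                                          const uint8_t* payload, size_t n) {
        std::array<uint8_t, 188> pkt{};
        pkt[0] = 0x47;  // sync byte
        pkt[1] = static_cast<uint8_t>((pusi ? 0x40 : 0x00) | ((pid >> 8) & 0x1F));
        pkt[2] = static_cast<uint8_t>(pid & 0xFF);
        size_t hdr = 4;
        if (n < 184) {
            // Short payload: adaptation field + payload ('11' in bits 5..4).
            size_t af_len = 183 - n;  // value of the adaptation_field_length byte
            pkt[3] = static_cast<uint8_t>(0x30 | (cc & 0x0F));
            pkt[4] = static_cast<uint8_t>(af_len);
            if (af_len > 0) {
                pkt[5] = 0x00;                            // no AF flags set
                std::memset(&pkt[6], 0xFF, af_len - 1);   // stuffing bytes
            }
            hdr = 5 + af_len;
        } else {
            pkt[3] = static_cast<uint8_t>(0x10 | (cc & 0x0F));  // payload only
            n = 184;
        }
        std::memcpy(&pkt[hdr], payload, n);
        return pkt;
    }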

Raw H.264 stream output by MediaCodec not playable

Submitted by 蹲街弑〆低调 on 2019-12-21 20:46:37
Question: I am creating raw H.264 stream output with MediaCodec. The problem is that the output file is not playable in the Android default player (API 16). How can Android export a file that is not playable in its own player, only in VLC on the PC? Maybe something is wrong with my code? My video is 384x288.

    public class AvcEncoder {
        private MediaCodec mediaCodec;
        private BufferedOutputStream outputStream;
        private File f;

        public AvcEncoder(int w, int h, String file_name) {
            f = new File(file_name + ".mp4");
            try
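A raw MediaCodec byte stream is just an H.264 elementary stream; giving the file an .mp4 extension does not make it an MP4 container, which is why VLC (which accepts bare elementary streams) plays it while the stock player does not. From API 18 the fix is to mux through MediaMuxer. A hedged C++ sketch using the NDK equivalent (API 21+, link with -lmediandk); the output path is a placeholder, the 384x288 dimensions come from the question, and the SPS/PPS buffers are assumed to have been captured from the codec-config output:

    #include <fcntl.h>
    #include <unistd.h>
    #include <media/NdkMediaCodec.h>
    #include <media/NdkMediaFormat.h>
    #include <media/NdkMediaMuxer.h>

    // Wrap one H.264 IDR frame into a real MP4 container via AMediaMuxer.
    bool MuxOneFrame(const uint8_t* nal, size_t nal_len,
                     const uint8_t* sps, size_t sps_len,
                     const uint8_t* pps, size_t pps_len) {
        int fd = open("/sdcard/out.mp4", O_CREAT | O_TRUNC | O_RDWR, 0644);
        if (fd < 0) return false;

        AMediaMuxer* muxer = AMediaMuxer_new(fd, AMEDIAMUXER_OUTPUT_FORMAT_MPEG_4);
        AMediaFormat* fmt = AMediaFormat_new();
        AMediaFormat_setString(fmt, AMEDIAFORMAT_KEY_MIME, "video/avc");
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_WIDTH, 384);
        AMediaFormat_setInt32(fmt, AMEDIAFORMAT_KEY_HEIGHT, 288);
        AMediaFormat_setBuffer(fmt, "csd-0", sps, sps_len);  // SPS
        AMediaFormat_setBuffer(fmt, "csd-1", pps, pps_len);  // PPS

        ssize_t track = AMediaMuxer_addTrack(muxer, fmt);
        AMediaMuxer_start(muxer);

        AMediaCodecBufferInfo info{};
        info.size = static_cast<int32_t>(nal_len);
        info.presentationTimeUs = 0;
        info.flags = 1;  // BUFFER_FLAG_KEY_FRAME (value from the Java API)
        AMediaMuxer_writeSampleData(muxer, static_cast<size_t>(track), nal, &info);

        AMediaMuxer_stop(muxer);
        AMediaMuxer_delete(muxer);
        AMediaFormat_delete(fmt);
        close(fd);
        return true;
    }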