H.264

Get the width / height of the video from H.264 NALU

谁说胖子不能爱 submitted on 2020-01-09 18:22:29
Question: I have gotten the SPS NALU (from the AVC Decoder Configuration Record) and am trying to parse the video width / height from it: 67 64 00 15 ac c8 60 20 09 6c 04 40 00 00 03 00 40 00 00 07 a3 c5 8b 67 80. This is my code to parse the SPS, but it gets the wrong values: pic_width_in_mbs_minus1 comes out as 5 and pic_height_in_map_units_minus1 as 1, while the video is actually 512 x 288 px. typedef struct _SequenceParameterSet { private: const unsigned char * m_pStart; unsigned short m_nLength; int m_nCurrentBit; unsigned int
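Two things commonly go wrong with this particular SPS: profile_idc is 0x64 (High profile), so a parser must walk the High-profile fields (chroma_format_idc, bit depths, scaling matrices) before it ever reaches pic_width_in_mbs_minus1, and the 00 00 03 in the middle of the dump is an emulation-prevention byte that must be stripped before bit-level parsing. A minimal Python sketch of the parse order (an illustration, not the asker's C++ code; crop units assume 4:2:0, and the input is one SPS NALU including the 0x67 header byte) that recovers 512 x 288 from the bytes above:

```python
class BitReader:
    """Bit-level reader over an emulation-prevention-free RBSP."""
    def __init__(self, data: bytes):
        self.data, self.pos = data, 0

    def u(self, n: int) -> int:               # n-bit unsigned integer
        val = 0
        for _ in range(n):
            byte = self.data[self.pos // 8]
            val = (val << 1) | ((byte >> (7 - self.pos % 8)) & 1)
            self.pos += 1
        return val

    def ue(self) -> int:                       # unsigned Exp-Golomb
        zeros = 0
        while self.u(1) == 0:
            zeros += 1
        return (1 << zeros) - 1 + self.u(zeros)

    def se(self) -> int:                       # signed Exp-Golomb
        k = self.ue()
        return (k + 1) // 2 if k % 2 else -(k // 2)

def strip_emulation_prevention(nalu: bytes) -> bytes:
    out, i = bytearray(), 0
    while i < len(nalu):
        if nalu[i:i + 3] == b"\x00\x00\x03":
            out += b"\x00\x00"; i += 3         # drop the 0x03 escape byte
        else:
            out.append(nalu[i]); i += 1
    return bytes(out)

def sps_dimensions(sps_nalu: bytes):
    r = BitReader(strip_emulation_prevention(sps_nalu))
    r.u(8)                                     # NAL header (0x67)
    profile_idc = r.u(8)
    r.u(16)                                    # constraint flags + level_idc
    r.ue()                                     # seq_parameter_set_id
    chroma_format_idc = 1                      # default: 4:2:0
    if profile_idc in (100, 110, 122, 244, 44, 83, 86, 118, 128):
        chroma_format_idc = r.ue()
        if chroma_format_idc == 3:
            r.u(1)                             # separate_colour_plane_flag
        r.ue(); r.ue(); r.u(1)                 # bit depths, transform bypass
        if r.u(1):                             # seq_scaling_matrix_present_flag
            for i in range(8 if chroma_format_idc != 3 else 12):
                if r.u(1):                     # scaling list present: skip it
                    last = nxt = 8
                    for _ in range(16 if i < 6 else 64):
                        if nxt:
                            nxt = (last + r.se() + 256) % 256
                        last = nxt or last
    r.ue()                                     # log2_max_frame_num_minus4
    poc_type = r.ue()
    if poc_type == 0:
        r.ue()                                 # log2_max_pic_order_cnt_lsb_minus4
    elif poc_type == 1:
        r.u(1); r.se(); r.se()
        for _ in range(r.ue()):
            r.se()
    r.ue(); r.u(1)                             # max_num_ref_frames, gaps flag
    w_mbs = r.ue() + 1                         # pic_width_in_mbs_minus1 + 1
    h_units = r.ue() + 1                       # pic_height_in_map_units_minus1 + 1
    frame_mbs_only = r.u(1)
    if not frame_mbs_only:
        r.u(1)                                 # mb_adaptive_frame_field_flag
    r.u(1)                                     # direct_8x8_inference_flag
    cl = cr = ct = cb = 0
    if r.u(1):                                 # frame_cropping_flag
        cl, cr, ct, cb = r.ue(), r.ue(), r.ue(), r.ue()
    # Crop units below assume chroma_format_idc == 1 (4:2:0), the common case.
    width = w_mbs * 16 - 2 * (cl + cr)
    height = (2 - frame_mbs_only) * (h_units * 16 - 2 * (ct + cb))
    return width, height

sps = bytes.fromhex(
    "67 64 00 15 ac c8 60 20 09 6c 04 40 00 00 03 00 40 00 00 07 a3 c5 8b 67 80")
print(sps_dimensions(sps))                     # -> (512, 288)
```

On this SPS the fields decode to pic_width_in_mbs_minus1 = 31 and pic_height_in_map_units_minus1 = 17, i.e. 32 x 18 macroblocks = 512 x 288, so values like 5 and 1 usually mean the High-profile branch or the emulation-prevention stripping was skipped.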

Decode H.264 raw stream using MediaCodec

寵の児 submitted on 2020-01-06 07:00:33
Question: I receive H.264 data from a server and want to decode the stream using MediaCodec and a TextureView on Android. I get the data from the server and parse it to extract the SPS, the PPS, and the video frame data, then pass this data to MediaCodec, but the function dequeueOutputBuffer(info, 100000) always returns -1 and I get a dequeueOutputBuffer timeout. Any help please, I have been stuck on this issue for three weeks. This is the code used to decode the video frame: public class
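Note that -1 from dequeueOutputBuffer is MediaCodec.INFO_TRY_AGAIN_LATER, which is normal for the first few calls while the decoder gathers SPS/PPS and input; it only signals a real failure if output never arrives. One way to isolate the bug is to check whether the parsed bytes are decodable at all, off the device. This sketch uses the PyAV library (pip install av) as a substitute decoder for verification, not the Android MediaCodec API; the capture file name is hypothetical:

```python
import av  # pip install av

# One decoder instance kept across calls, so SPS/PPS state persists.
codec = av.CodecContext.create("h264", "r")

def decode_annexb(chunk: bytes):
    """Feed Annex-B H.264 bytes (SPS/PPS/slices) and yield decoded frames."""
    for packet in codec.parse(chunk):
        for frame in codec.decode(packet):
            yield frame

with open("capture.h264", "rb") as f:          # hypothetical dump of the
    data = f.read()                            # bytes received from the server
for frame in decode_annexb(data):
    print(frame.width, frame.height, frame.pts)
```

If PyAV produces no frames either, the problem is in the SPS/PPS/frame parsing rather than in the MediaCodec setup.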

H.264 video source is not playing in any browser

旧街凉风 submitted on 2020-01-06 06:48:28
Question: I'm using Janus to pass a video stream from an RTSP/H.264 camera (QD800) through WebRTC to browsers. Which browsers am I speaking of here? Here are the details: Firefox (Linux, 64-bit, version 59.0.2), Firefox (Windows 7, 64-bit, version 59.0.2), Chrome (Linux, 64-bit, version 49.0.2623.87), Chrome (Windows 7, 64-bit, version 66.0.3359.139). Originally the camera sends 420029 as the profile-level-id in the SDP when negotiating with the client (Janus). However, as such, Firefox is not able to play this version
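A workaround often used in this situation (an assumption here, not taken from the question itself) is to rewrite the profile-level-id the camera offers to Constrained Baseline, 42e01f, which both Firefox and Chrome accept in WebRTC, before the SDP reaches the browser. A minimal sketch:

```python
import re

def force_baseline(sdp: str) -> str:
    # Replace e.g. "profile-level-id=420029" with Constrained Baseline 3.1.
    return re.sub(r"profile-level-id=[0-9a-fA-F]{6}",
                  "profile-level-id=42e01f", sdp)

sdp = "a=fmtp:96 packetization-mode=1;profile-level-id=420029\r\n"
print(force_baseline(sdp))
```

This only relabels the negotiated profile; it works when the camera's actual bitstream is decodable by the browser despite the advertised profile, which is frequently the case with IP cameras.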

OpenCV: single H.264 raw frame as a binary string

倾然丶 夕夏残阳落幕 submitted on 2020-01-03 17:17:09
Question: I have created an RTSP client in Python that receives an H.264 stream and returns single raw H.264 frames as binary strings. I am trying to process each H.264 frame on the fly. I have unsuccessfully tried several ways to convert such a frame into a numpy array for processing. So far I know that cv2.VideoCapture only accepts a file name as its argument, not a frame nor a StringIO object (a file-like pointer to a buffer), but I need to pass it my string. I have also tried something like: nparr =
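One approach that avoids cv2.VideoCapture entirely: decode the raw bytes with the PyAV library (pip install av; an assumption, not part of the question) and convert the result to the numpy array OpenCV functions consume. The sketch assumes each binary string is a complete Annex-B access unit:

```python
import av  # pip install av

# Module-level decoder: state (SPS/PPS, reference frames) must persist
# across calls, since parameter sets typically arrive only once.
codec = av.CodecContext.create("h264", "r")

def h264_frame_to_bgr(raw: bytes):
    """Decode one raw H.264 frame; returns an HxWx3 BGR ndarray or None."""
    for packet in codec.parse(raw):
        for frame in codec.decode(packet):
            return frame.to_ndarray(format="bgr24")
    return None  # decoder may need more data first (e.g. SPS/PPS, more slices)
```

The returned bgr24 ndarray can be passed straight to OpenCV calls such as cv2.imshow or cv2.cvtColor.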

Build ffmpeg for Windows Phone 8

雨燕双飞 submitted on 2020-01-03 09:28:32
Question: How could I build ffmpeg for Windows Phone 8 (ARM)? I couldn't find any info about it. Could I call the H.264 hardware encoder/decoder from C++ on Windows Phone 8? It looks like Media Foundation is too limited. Thank you. Answer 1: AFAIK it is not possible to compile ffmpeg for Windows Phone 8 at the moment due to a missing toolchain. But there is a Kickstarter project for a Windows 8 port (http://www.kickstarter.com/projects/1061646928/vlc-for-the-new-windows-8-user-experience-metro), stating

Publish webcam feed to Flash Media Server

匆匆过客 submitted on 2020-01-03 00:38:20
Question: I have a fairly high-end webcam (SNC-RZ25N) that I need to rebroadcast using Flash Media Server. I can get the picture as MPEG-4 (not H.264), so I need to transcode to H.264 and publish at multiple bitrates to FMS. The only solution I have come up with thus far is to transcode the stream using ffmpeg, also use ffmpeg to down-convert the stream (for the multiple bitrates), and then publish all of these transcoded streams to FMS via custom Java code (using Red5). Surely
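For reference, the transcode-and-downconvert step can be done with one ffmpeg process per rendition publishing straight to FMS over RTMP, with no custom publishing code in between. A sketch, run from Python's subprocess; the source URL, RTMP application, stream names, and bitrates are placeholders, not values from the question:

```python
import subprocess

SOURCE = "rtsp://camera.example/stream"        # hypothetical camera feed
FMS = "rtmp://fms.example/live"                # hypothetical FMS application

for name, bitrate in [("low", "400k"), ("mid", "800k"), ("high", "1500k")]:
    subprocess.Popen([
        "ffmpeg", "-i", SOURCE,
        "-c:v", "libx264", "-b:v", bitrate,    # transcode MPEG-4 -> H.264
        "-c:a", "aac", "-b:a", "96k",
        "-f", "flv", f"{FMS}/cam_{name}",      # publish rendition via RTMP
    ])
```

Each process handles one bitrate; FMS then sees three independently published streams, which is the usual layout for client-side bitrate switching.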

Android: MPEG-4/H.264 packetization example

让人想犯罪 __ submitted on 2020-01-02 09:59:36
Question: I need to split an MPEG-4 video stream (actually from the Android video camera) to send it through RTP. The specification is a little too large for quick reference. I wonder if there is any example / open-source code for MPEG-4 packetization? Thanks for any help! Answer 1: The MPEG-4 file format is also called ISO/IEC 14496-14; Google it and you will find the specifications. However, what you are trying to do (an RTP publisher) will be hard for the following reasons: MPEG-4 has its header at the end of the file, which means the header
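For a sense of the mechanics: an RTP packet is just a 12-byte RFC 3550 header in front of each payload chunk. The sketch below packs that header and splits one encoded frame across MTU-sized packets. The payload type, SSRC, and naive fragmentation are placeholder assumptions; a real publisher must also follow the codec's payload format rules (e.g. RFC 6184 FU-A fragmentation for H.264):

```python
import struct

MTU = 1400  # assumed payload budget per packet

def rtp_packets(frame: bytes, seq: int, timestamp: int,
                ssrc: int = 0x12345678, payload_type: int = 96):
    """Yield RTP packets (RFC 3550 header + chunk) for one encoded frame."""
    chunks = [frame[i:i + MTU] for i in range(0, len(frame), MTU)]
    for i, chunk in enumerate(chunks):
        marker = 1 if i == len(chunks) - 1 else 0   # mark last packet of frame
        header = struct.pack(
            "!BBHII",
            0x80,                            # V=2, P=0, X=0, CC=0
            (marker << 7) | payload_type,    # marker bit + payload type
            (seq + i) & 0xFFFF,              # sequence number, wraps at 2^16
            timestamp & 0xFFFFFFFF,          # same timestamp for whole frame
            ssrc)
        yield header + chunk
```

The same timestamp is used for every packet of a frame, with the marker bit set on the last one, which is the convention for video payloads.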

How do I convert an h.264 stream to MP4 using ffmpeg and pipe the result to the client?

̄綄美尐妖づ submitted on 2020-01-02 08:11:30
Question: I have an H.264-encoded video stream on my server (node.js) and I want to use ffmpeg to convert it to an MP4 stream. Then I want to pipe that MP4 stream from the child process to the client using the response of an HTTP server that I have set up. I am very confused about all the options ffmpeg has and am not sure how to pipe the output of the child process to the HTTP response. I have tried several combinations of ffmpeg options, but the video does not play in the browser (or show any sign of
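The detail that usually decides whether this works: a normal MP4 needs a seekable output so ffmpeg can write the moov atom at the end, so writing MP4 to a pipe requires fragmented MP4 via -movflags frag_keyframe+empty_moov. A sketch of the invocation, shown with Python's subprocess since the server here is node.js (treat it as documentation of the ffmpeg flags, not of the server code):

```python
import subprocess

proc = subprocess.Popen([
    "ffmpeg",
    "-f", "h264", "-i", "pipe:0",              # raw H.264 in on stdin
    "-c:v", "copy",                            # remux only, no re-encode
    "-f", "mp4",
    "-movflags", "frag_keyframe+empty_moov",   # pipe-safe, streamable MP4
    "pipe:1",                                  # MP4 out on stdout
], stdin=subprocess.PIPE, stdout=subprocess.PIPE)

# proc.stdin receives the H.264 stream; proc.stdout is what you would
# pipe into the HTTP response (in node.js, ffmpeg.stdout.pipe(res)).
```

Without the movflags option, ffmpeg either fails on the non-seekable pipe or emits an MP4 the browser cannot begin playing until it has the trailing moov atom.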

Putting an H.264 I-frame into AVSampleBufferDisplayLayer but no video image is displayed

元气小坏坏 submitted on 2020-01-02 04:49:06
Question: After a detailed review of WWDC 2014 Session 513, I am trying to write an app on iOS 8.0 to decode and display one live H.264 stream. First of all, I construct an H.264 parameter set successfully. When I get one I-frame with a 4-byte start code, just like "0x00 0x00 0x00 0x01 0x65 ...", I put it into a CMBlockBuffer. Then I construct a CMSampleBuffer using the previous CMBlockBuffer. After that, I put the CMSampleBuffer into an AVSampleBufferDisplayLayer. Everything is OK (I checked the values returned)
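One likely culprit (an assumption; the excerpt does not say how the CMBlockBuffer is framed): VideoToolbox and AVSampleBufferDisplayLayer expect AVCC framing, where each NAL unit carries a 4-byte big-endian length prefix, not the Annex-B start code 0x00 0x00 0x00 0x01 shown above. A Python sketch of the conversion for one access unit, to illustrate the byte layout:

```python
import struct

def annexb_to_avcc(data: bytes) -> bytes:
    """Convert Annex-B NAL units to AVCC (4-byte length-prefixed) framing."""
    nalus = []
    for chunk in data.split(b"\x00\x00\x01"):
        # A NALU never ends in 0x00 (RBSP trailing bits guarantee a nonzero
        # final byte), so stripping trailing zeros only removes the leading
        # zeros of the next start code (handles 3- and 4-byte start codes).
        chunk = chunk.rstrip(b"\x00")
        if chunk:
            nalus.append(chunk)
    return b"".join(struct.pack(">I", len(n)) + n for n in nalus)

frame = b"\x00\x00\x00\x01\x65\x88\x84\x00"    # hypothetical I-frame bytes
print(annexb_to_avcc(frame).hex())             # -> 00000004 6588 8400
```

With start codes left in place the decode calls typically still return success, exactly matching the "everything is OK but nothing is displayed" symptom.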

Cannot play certain videos

心不动则不痛 submitted on 2020-01-02 03:50:29
Question: I'm trying to play movies on an Android device from our server. It is not a media server, just a regular Apache server. We use the same API to access the videos on the iPhone and it works fine. On the Android device certain videos work and others do not, although they were all created the same way; the majority of the ones that don't work are composed of still images and audio. We have tried re-encoding them with Videora and tried hinting them with MP4Box. All of the videos play perfectly
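A quick way to narrow this down is to compare the video stream parameters of a working file against a failing one; a profile or level above the device decoder's limit is a typical cause of "some files play, others don't" on Android. A sketch using ffprobe (file names are hypothetical):

```python
import json
import subprocess

def video_params(path: str) -> dict:
    """Return codec, profile, level, and dimensions of the first video stream."""
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=codec_name,profile,level,width,height",
        "-of", "json", path,
    ])
    return json.loads(out)["streams"][0]

print(video_params("works.mp4"))
print(video_params("fails.mp4"))
```

Any field that differs between the two files (profile, level, or unusual dimensions) is the first thing to normalize when re-encoding.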