media-source

Flush & Latency Issue with Fragmented MP4 Creation in FFmpeg

为君一笑 submitted on 2019-11-27 19:55:19
I'm creating a fragmented MP4 for HTML5 streaming using the following command: -i rtsp://172.20.28.52:554/h264 -vcodec copy -an -f mp4 -reset_timestamps 1 -movflags empty_moov+default_base_moof+frag_keyframe -loglevel quiet - "-i rtsp://172.20.28.52:554/h264" because the source is an H.264 RTP packet stream from an IP camera. For the sake of testing, the camera is set with a GOP of 1 (i.e. all frames are key frames). "-vcodec copy" because I don't need transcoding, only remuxing to MP4. "-movflags empty_moov+default_base_moof+frag_keyframe" to create a fragmented MP4 according to the media
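
A minimal Node.js sketch of the server side of such a setup, assuming the 'ws' npm package: it spawns the ffmpeg command from the question and relays the fragmented MP4 stdout chunks to a browser over a WebSocket. The port and relay strategy are illustrative, and this only shows the shape of the pipeline, not a fix for the flush/latency behaviour being asked about.

```javascript
// Sketch: spawn the ffmpeg command from the question and forward its stdout
// chunks to a browser over a WebSocket. Assumes the 'ws' package is installed.
const { spawn } = require('child_process');
const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 }); // illustrative port

wss.on('connection', (socket) => {
  const ffmpeg = spawn('ffmpeg', [
    '-i', 'rtsp://172.20.28.52:554/h264', // camera stream from the question
    '-vcodec', 'copy',                    // remux only, no transcoding
    '-an',                                // drop audio
    '-f', 'mp4',
    '-reset_timestamps', '1',
    '-movflags', 'empty_moov+default_base_moof+frag_keyframe',
    '-loglevel', 'quiet',
    '-'                                   // write the fragmented MP4 to stdout
  ]);

  // Each stdout chunk is relayed as a binary WebSocket message.
  ffmpeg.stdout.on('data', (chunk) => socket.send(chunk));
  socket.on('close', () => ffmpeg.kill('SIGINT'));
});
```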

Media Source API not working for a custom webm file (Chrome Version 23.0.1271.97 m)

怎甘沉沦 submitted on 2019-11-27 09:20:50
I am referring to a Media Source API demo given at this link. It works fine for the given test webm file, but when I tried to change the file name to a custom webm file the code stopped working. It generates the following error: Uncaught Error: INVALID_STATE_ERR: DOM Exception 11 at the following code: sourceBuffer.append(new Uint8Array(e.target.result)); To check whether the custom webm file itself works, I created a test page with a video tag whose source is that custom webm file. When I ran that code it worked fine. I am unable to understand the reason for this
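
For context, a hedged sketch of the modern, unprefixed MSE flow (the Chrome 23 demo used the webkit-prefixed MediaSource and sourceBuffer.append); appending before the 'sourceopen' event, or with a MIME/codec string that does not match the webm file, is a typical way to hit this kind of INVALID_STATE_ERR. The codec string and file name below are assumptions.

```javascript
// Modern MSE flow: only add a SourceBuffer and append data once the
// MediaSource has actually reached the 'open' state.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const mime = 'video/webm; codecs="vp8, vorbis"'; // assumed codecs
  if (!MediaSource.isTypeSupported(mime)) {
    console.error('Container/codec not supported:', mime);
    return;
  }
  const sourceBuffer = mediaSource.addSourceBuffer(mime);

  fetch('custom.webm')                     // hypothetical file name
    .then((response) => response.arrayBuffer())
    .then((data) => sourceBuffer.appendBuffer(new Uint8Array(data)));
});
```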

Live streaming DASH content using MP4Box

断了今生、忘了曾经 submitted on 2019-11-27 07:01:29
I'm trying to live stream H.264 content to HTML5 using the Media Source Extensions API. The following method works pretty well: ffmpeg -i rtsp://10.50.1.29/media/video1 -vcodec copy -f mp4 -reset_timestamps 1 -movflags frag_keyframe+empty_moov -loglevel quiet out.mp4 and then: mp4box -dash 1000 -frag 1000 -frag-rap out.mp4 I can take the MP4Box output (out_dashinit.mp4) and send it through WebSockets, chunk by chunk, to a JavaScript client that feeds it to the Media Source API. However, this is not a good method for live content. What I'm trying to do now is to create a single pipeline in
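
A browser-side sketch of the working "chunk by chunk over WebSockets" path described above, under the assumption that the server sends the init and media segments as binary messages; the codec string and endpoint are illustrative, and appends are queued so they never overlap.

```javascript
// Sketch: receive fMP4/DASH segments over a WebSocket and feed them to MSE.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // The AVC profile string is an assumption; it must match the MP4Box output.
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  const queue = [];

  // Appends must not overlap: wait for 'updateend' before the next append.
  sourceBuffer.addEventListener('updateend', () => {
    if (queue.length && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  const ws = new WebSocket('ws://localhost:8080'); // hypothetical endpoint
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    if (!sourceBuffer.updating && queue.length === 0) {
      sourceBuffer.appendBuffer(event.data);
    } else {
      queue.push(event.data);
    }
  };
});
```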

Unable to stream video over a websocket to Firefox

谁说胖子不能爱 submitted on 2019-11-27 03:23:08
Question: I have written some code to stream video over a websocket to a SourceBuffer, which works in Chrome and Edge. However, when I run this in Firefox, the video never plays back; just a spinning wheel animation is displayed. When I check the <video> statistics, it reads HAVE_METADATA as the ready state and NETWORK_LOADING as the network state. The code looks as follows: <!DOCTYPE html> <html> <head> <meta charset="utf-8"/> </head> <body> <video controls></video> <script> var mime = 'video/mp4; codecs=
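
A small diagnostic sketch, not a fix: two common reasons for Firefox stalling at HAVE_METADATA are an unsupported codec string and appends issued while the SourceBuffer is still updating, so it can help to check MediaSource.isTypeSupported and watch the buffered ranges. The MIME string below is an assumption.

```javascript
// Check whether the codec string is accepted in this browser at all.
const mime = 'video/mp4; codecs="avc1.42E01E, mp4a.40.2"'; // assumed codecs
console.log('isTypeSupported:', MediaSource.isTypeSupported(mime));

// Periodically log the element's ready state and buffered ranges while data
// is being appended, to see whether any media actually reaches the buffer.
const video = document.querySelector('video');
setInterval(() => {
  const ranges = [];
  for (let i = 0; i < video.buffered.length; i++) {
    ranges.push([video.buffered.start(i), video.buffered.end(i)]);
  }
  console.log('readyState:', video.readyState, 'buffered:', ranges);
}, 1000);
```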

How to use “segments” mode at SourceBuffer of MediaSource to render the same result at Chromium, Chrome and Firefox?

a 夏天 submitted on 2019-11-26 18:37:02
Question: After further development of the code in the OP at How to use Blob URL, MediaSource or other methods to play concatenated Blobs of media fragments?, I have been able to achieve the requirement of recording discrete media fragments using MediaRecorder, adding cues to the resulting webm file using ts-ebml, and recording the discrete media fragments as a single media file using MediaSource with the .mode of SourceBuffer set to "sequence" at both Chromium and Firefox browsers. The Chromium issue at Monitor and
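
A hedged sketch of the difference in question: with mode set to "segments" each fragment keeps its own internal timestamps, so independently recorded fragments that all start at 0 need an explicit timestampOffset, whereas "sequence" mode places appends back to back automatically. The helper below is illustrative, not the OP's code.

```javascript
// Append an array of recorded webm Blobs in either "segments" or "sequence" mode.
async function appendFragments(sourceBuffer, blobs, useSegmentsMode) {
  sourceBuffer.mode = useSegmentsMode ? 'segments' : 'sequence';
  let offset = 0;

  for (const blob of blobs) {
    if (useSegmentsMode) {
      // Place this fragment right after the previous one on the timeline.
      sourceBuffer.timestampOffset = offset;
    }
    const data = await blob.arrayBuffer();
    sourceBuffer.appendBuffer(data);
    // Wait for the append to finish before touching the buffer again.
    await new Promise((resolve) =>
      sourceBuffer.addEventListener('updateend', resolve, { once: true })
    );
    offset = sourceBuffer.buffered.end(sourceBuffer.buffered.length - 1);
  }
}
```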

What exactly is fragmented MP4 (fMP4)? How is it different from normal MP4?

故事扮演 submitted on 2019-11-26 09:15:57
Question: Media Source Extensions (MSE) need fragmented MP4 for playback in the browser. Answer 1: A fragmented MP4 contains a series of segments which can be requested individually if your server supports byte-range requests. Boxes, aka atoms: all MP4 files use an object-oriented format made up of boxes, aka atoms. You can view a representation of the boxes in your MP4 using an online tool such as MP4 Parser or, if you're using Windows, MP4 Explorer. Let's compare a normal MP4 with one that is fragmented:
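
A short sketch that makes the structural difference visible: it walks the top-level boxes of an MP4 buffer (a 32-bit size followed by a four-character type), so a normal file shows something like ftyp/moov/mdat while a fragmented one shows ftyp/moov followed by repeating moof/mdat pairs. 64-bit "largesize" boxes are not handled, for brevity.

```javascript
// List the top-level box types and sizes of an MP4 held in an ArrayBuffer.
function listTopLevelBoxes(arrayBuffer) {
  const view = new DataView(arrayBuffer);
  const boxes = [];
  let offset = 0;
  while (offset + 8 <= view.byteLength) {
    const size = view.getUint32(offset); // box size, big-endian
    const type = String.fromCharCode(
      view.getUint8(offset + 4), view.getUint8(offset + 5),
      view.getUint8(offset + 6), view.getUint8(offset + 7)
    );
    boxes.push({ type, size });
    if (size < 8) break; // 0 or 1 means "to end of file" / 64-bit size; stop here
    offset += size;
  }
  return boxes;
}

// Usage: fetch('video.mp4').then(r => r.arrayBuffer()).then(b => console.log(listTopLevelBoxes(b)));
```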

How to use Blob URL, MediaSource or other methods to play concatenated Blobs of media fragments?

无人久伴 submitted on 2019-11-26 05:35:24
Question: I am attempting to implement, for lack of a different description, an offline media context. The concept is to create 1 second Blobs of recorded media, with the ability to play the 1 second Blobs independently at an HTMLMediaElement and to play the full media resource from the concatenated Blobs. The issue is that once the Blobs are concatenated, the media resource does not play at the HTMLMediaElement using either a Blob URL or MediaSource. The created Blob URL only plays 1 second of the concatenated Blob
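
A hedged sketch of the setup being described: record discrete ~1 second webm Blobs with MediaRecorder (so each can also play on its own), then attempt playback of the concatenation through a Blob URL, which, as the question reports, typically stops after the first fragment because each independently recorded Blob carries its own webm header. The stream source and MIME type are assumptions.

```javascript
const recordedBlobs = [];

// Record one discrete ~1 second fragment from a MediaStream.
function recordOneSecond(stream) {
  return new Promise((resolve) => {
    const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs=vp8' });
    const parts = [];
    recorder.ondataavailable = (event) => parts.push(event.data);
    recorder.onstop = () => {
      recordedBlobs.push(new Blob(parts, { type: 'video/webm' }));
      resolve();
    };
    recorder.start();
    setTimeout(() => recorder.stop(), 1000); // one discrete 1-second fragment
  });
}

// Attempted playback of the full resource from the concatenated Blobs.
function playConcatenated(videoElement) {
  const combined = new Blob(recordedBlobs, { type: 'video/webm' });
  videoElement.src = URL.createObjectURL(combined);
  videoElement.play();
}
```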