How to use ffmpeg for streaming mp4 via websocket

Submitted by 巧了我就是萌 on 2019-12-24 12:17:13

Question


I've written a sample in nodejs which streams some input to the client in mp4 format over a websocket connection. On the client side, the mp4 packages are appended to a MediaSource SourceBuffer.

This runs fine, but only if the client receives the stream from the very beginning, starting with the first package. A second client can't play the current stream, because it doesn't get the stream from the beginning.

I tried (by trial and error) to save the first package ffmpeg sends and send it at the start of every new connection, followed by the current stream. But then the SourceBuffer breaks with an encoding error.

Here is the ffmpeg command :

ffmpeg -i someInput -g 59 \
  -vcodec libx264 -profile:v baseline \
  -f mp4 -movflags empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof \
  -reset_timestamps 1 \
  -

The flags "empty_moov+omit_tfhd_offset+frag_keyframe+default_base_moof" should make the stream packages independent, by writing an empty moov atom at the beginning and starting a new fragment at each keyframe (one every 59 frames, per -g 59). So I don't get why a client can't start viewing the stream after it has begun.
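The "save the first package and replay it" idea can be sketched as a small broadcaster on the server side. This is not from the original post; it's a minimal illustration (plain Node.js, no real websocket library) where `FragmentBroadcaster` and the `send` callback are hypothetical names. In a real server, `send` would be `ws.send` on each websocket connection, and `push` would be fed from ffmpeg's stdout, one complete fragment at a time.

```javascript
// Sketch: cache the first (initialization) fragment and replay it to
// late-joining clients before any live fragments are delivered.
class FragmentBroadcaster {
  constructor() {
    this.initSegment = null;      // first fragment ffmpeg emits (ftyp + moov)
    this.clients = new Set();     // per-client send callbacks
  }

  // Register a client; if the init segment is already known, send it first.
  // Returns an unsubscribe function.
  addClient(send) {
    if (this.initSegment) send(this.initSegment);
    this.clients.add(send);
    return () => this.clients.delete(send);
  }

  // Feed one complete fragment from the encoder to all clients.
  push(fragment) {
    if (!this.initSegment) this.initSegment = fragment;
    for (const send of this.clients) send(fragment);
  }
}
```

The critical assumption here is that `push` is called with whole fragments, never with arbitrary byte chunks; splitting the encoder output on fragment boundaries is a separate step.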


Answer 1:


The output of that command is not a 'stream' per se. It is a series of concatenated fragments. Each fragment must be received in its entirety: if a partial fragment is received, it confuses the parser to the point where it cannot identify the start of the next fragment. In addition, the first fragment output is an initialization fragment, which must be sent to each client first; after that, any fragment can be played. Hence the initialization fragment must be cached by the server.
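Finding those fragment boundaries can be done by walking the ISO BMFF box structure: every top-level box starts with a 4-byte big-endian size followed by a 4-byte type, and the initialization segment is everything up to the end of the `moov` box. The sketch below is not from the answer; it assumes the whole byte stream is available in a single Node.js Buffer, and the function names are illustrative.

```javascript
// Sketch: split an fMP4 byte stream into the initialization segment
// (ftyp + moov) and the media fragments that follow (moof + mdat pairs).

// Walk top-level ISO BMFF boxes: 4-byte big-endian size, 4-byte ASCII type.
function parseBoxes(buf) {
  const boxes = [];
  let off = 0;
  while (off + 8 <= buf.length) {
    const size = buf.readUInt32BE(off);
    const type = buf.toString('ascii', off + 4, off + 8);
    if (size < 8 || off + size > buf.length) break; // incomplete trailing box
    boxes.push({ type, start: off, size });
    off += size;
  }
  return boxes;
}

// Returns { init, rest }: init is everything up to the end of 'moov',
// rest is the remaining fragmented media data.
function splitInitSegment(buf) {
  const boxes = parseBoxes(buf);
  const moov = boxes.find((b) => b.type === 'moov');
  if (!moov) return { init: null, rest: buf };
  const end = moov.start + moov.size;
  return { init: buf.slice(0, end), rest: buf.slice(end) };
}
```

In a live pipeline the same box-size logic would be applied incrementally to the encoder's stdout, buffering bytes until one complete box (or fragment) has arrived before forwarding it.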



Source: https://stackoverflow.com/questions/31834456/how-to-use-ffmpeg-for-streaming-mp4-via-websocket
