media-source

How to identify that an HTML5 media element is stalled and waiting for further media to continue playing

百般思念 submitted on 2020-01-10 20:14:29
Question: I am working with MediaSource and SourceBuffer to play HTML5 video. I am sequentially fetching DASH fragments to keep video playback uninterrupted. But sometimes, due to network conditions, the SourceBuffer runs out of data and playback cannot continue. When that data arrives, playback resumes, but during that period the video looks stalled. I want to add a visual indication over the media element that it is paused because it is buffering the required data. I tried binding the 'waiting' and 'stalled' events on the video, but none of
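When 'waiting' and 'stalled' are unreliable with MSE-backed playback, a common workaround is to poll currentTime yourself: if the playhead stops advancing while the video is neither paused nor ended, treat it as buffering. A minimal sketch; the overlay element id and the 500 ms polling interval are illustrative assumptions, not from the question:

```javascript
// Stall watchdog: if currentTime stops advancing while the video is
// neither paused nor ended, treat it as buffering and show an overlay.
const video = document.querySelector('video');
const spinner = document.getElementById('buffering-overlay'); // hypothetical overlay element

let lastTime = -1;
setInterval(() => {
  if (!video.paused && !video.ended) {
    if (video.currentTime === lastTime) {
      spinner.style.display = 'block';   // playhead is stuck -> stalled
    } else {
      spinner.style.display = 'none';
    }
    lastTime = video.currentTime;
  }
}, 500);

// The built-in events can still be used when the browser does fire them.
video.addEventListener('waiting', () => { spinner.style.display = 'block'; });
video.addEventListener('playing', () => { spinner.style.display = 'none'; });
```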

How to use ffmpeg for streaming mp4 via websocket

巧了我就是萌 submitted on 2019-12-24 12:17:13
Question: I've written a sample in Node.js which streams some input to the client over a WebSocket connection in MP4 format. On the client side, the MP4 packets are added to a MediaSource SourceBuffer. This runs fine, but only if the client gets the stream from the beginning, starting with the first packet. So another client can't play the current stream, because it won't get the stream from the beginning. I tried (by trial and error) saving the first packet ffmpeg sends and sending it at the beginning of a new connection, then
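A frequently used approach for late joiners is to cache the initialization segment (the first bytes ffmpeg writes, containing the ftyp and moov boxes) and replay it to every client that connects later, before any media fragments. A rough sketch using the `ws` package; the ffmpeg arguments, port, and input are illustrative, and in practice the init segment may span more than one stdout chunk, so this is only a starting point:

```javascript
const { spawn } = require('child_process');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

// ffmpeg writes a fragmented MP4 to stdout (arguments are illustrative).
const ffmpeg = spawn('ffmpeg', [
  '-i', 'input.mp4', '-c', 'copy', '-f', 'mp4',
  '-movflags', 'empty_moov+default_base_moof+frag_keyframe', 'pipe:1'
]);

let initSegment = null;   // first chunk: ftyp + moov (may actually need more than one chunk)
const clients = new Set();

ffmpeg.stdout.on('data', (chunk) => {
  if (initSegment === null) initSegment = chunk;
  for (const ws of clients) ws.send(chunk);
});

wss.on('connection', (ws) => {
  // Late joiners get the cached init segment before any media fragments.
  if (initSegment !== null) ws.send(initSegment);
  clients.add(ws);
  ws.on('close', () => clients.delete(ws));
});
```

Note that a late joiner will still only decode cleanly once the next fragment starting with a keyframe arrives.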

MediaSource randomly stops video

耗尽温柔 submitted on 2019-12-22 06:29:36
Question: I am working on a project with the pipeline getUserMedia -> MediaRecorder -> Socket.IO -> MediaSource appendBuffer. I got it to work, however after a few seconds it randomly stops. I know about WebRTC, but the project runs in an embedded version of Chrome that doesn't support WebRTC. Server: 'use strict'; const io = require('socket.io')(); io.on('connection', (socket) => { console.log('connection'); socket.on('stream', (data) => { socket.emit(
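A common cause of playback stopping after a few seconds in this kind of pipeline is calling appendBuffer while the SourceBuffer is still updating, which throws an InvalidStateError and kills the stream. A hedged client-side sketch that queues incoming Socket.IO chunks instead; the MIME/codec string is an assumption and must match what MediaRecorder actually produces:

```javascript
const socket = io();
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

const queue = [];
let sourceBuffer = null;

mediaSource.addEventListener('sourceopen', () => {
  // Codec string is an assumption; it must match the MediaRecorder mimeType.
  sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  sourceBuffer.addEventListener('updateend', appendNext);
});

socket.on('stream', (data) => {
  queue.push(data);
  appendNext();
});

function appendNext() {
  // Only append when the buffer is idle, otherwise appendBuffer throws.
  if (sourceBuffer && !sourceBuffer.updating && queue.length > 0) {
    sourceBuffer.appendBuffer(new Uint8Array(queue.shift()));
  }
}
```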

Get mime type for MediaSource.isTypeSupported

回眸只為那壹抹淺笑 submitted on 2019-12-22 04:39:25
Question: How do I get the MIME type I need to pass to MediaSource.isTypeSupported using ffprobe/ffmpeg? For instance, on my computer, this returns true: MediaSource.isTypeSupported('video/mp4; codecs="avc1.64000d,mp4a.40.2"') while this returns false: MediaSource.isTypeSupported('video/mp4') I'm not sure how to obtain what corresponds to the avc1.64000d,mp4a.40.2 part for a given video. Here is a larger list of what this part may look like. ffprobe -show_streams -i video.mp4 returns a number of interesting
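One way to derive the codecs parameter is to read ffprobe's JSON output and assemble the RFC 6381 string from the reported profile and level. In the sketch below the profile-name-to-profile_idc mapping is only partial and the constraint flags are assumed to be 00, which is not always accurate, so treat it as an illustration rather than a complete solution:

```javascript
const { execFileSync } = require('child_process');

// Partial mapping from ffprobe's H.264 profile names to profile_idc (hex).
// Constraint flags are assumed to be 00, which is a simplification.
const H264_PROFILES = { 'Baseline': '42', 'Constrained Baseline': '42', 'Main': '4d', 'High': '64' };

function codecsFor(file) {
  const out = execFileSync('ffprobe',
    ['-v', 'error', '-show_streams', '-print_format', 'json', file]);
  const streams = JSON.parse(out.toString()).streams;
  return streams.map((s) => {
    if (s.codec_name === 'h264') {
      const level = Number(s.level).toString(16).padStart(2, '0'); // e.g. 13 -> "0d"
      return `avc1.${H264_PROFILES[s.profile] || '64'}00${level}`;
    }
    if (s.codec_name === 'aac') return 'mp4a.40.2'; // assumes AAC-LC
    return s.codec_name;
  }).join(',');
}

console.log(`video/mp4; codecs="${codecsFor('video.mp4')}"`);
```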

How to keep a live MediaSource video stream in-sync?

假装没事ソ submitted on 2019-12-21 07:14:04
Question: I have a server application which renders a 30 FPS video stream, then encodes and muxes it in real time into a WebM byte stream. On the client side, an HTML5 page opens a WebSocket to the server, which starts generating the stream when the connection is accepted. After the header is delivered, each subsequent WebSocket frame consists of a single WebM SimpleBlock. A keyframe occurs every 15 frames, and when this happens a new Cluster is started. The client also creates a MediaSource, and on
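A common way to keep a live MSE stream near real time is to measure how far playback has fallen behind the end of the buffered range and skip forward (or temporarily speed up) when the drift grows. A sketch with an assumed 1-second drift threshold and 1-second check interval:

```javascript
const video = document.querySelector('video');
const MAX_DRIFT = 1.0; // seconds behind the live edge before resyncing (assumption)

setInterval(() => {
  if (video.buffered.length === 0) return;
  const liveEdge = video.buffered.end(video.buffered.length - 1);
  const drift = liveEdge - video.currentTime;
  if (drift > MAX_DRIFT) {
    // Jump close to the newest buffered data; a gentler option is
    // video.playbackRate = 1.05 until the drift shrinks again.
    video.currentTime = liveEdge - 0.1;
  }
}, 1000);
```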

Streaming a growing file using the MediaSource API.

南楼画角 submitted on 2019-12-21 06:55:41
Question: So I have an .mp4 file that is still downloading. I would like to stream the downloading file into a video element using the MediaSource API. How would I do this? const NUM_CHUNKS = 5; var video = document.querySelector('video'); video.src = video.webkitMediaSourceURL; video.addEventListener('webkitsourceopen', function(e) { var chunkSize = Math.ceil(file.size / NUM_CHUNKS); // Slice the video into NUM_CHUNKS and append each to the media element. for (var i = 0; i < NUM_CHUNKS; ++i) { var startByte = chunkSize
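The webkit-prefixed API in that snippet has long been removed; with the standard MediaSource API the usual pattern is to fetch byte ranges of the file and append them one at a time, waiting for updateend between appends. A sketch that assumes the server supports Range requests and exposes Content-Length, and that the file is a fragmented MP4 matching the codec string shown (the URL and MIME string are placeholders):

```javascript
const NUM_CHUNKS = 5;
const FILE_URL = '/video.mp4';           // hypothetical URL of the downloading file
const MIME = 'video/mp4; codecs="avc1.64000d,mp4a.40.2"'; // must match the actual file

const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer(MIME);
  const head = await fetch(FILE_URL, { method: 'HEAD' });
  const size = Number(head.headers.get('Content-Length'));
  const chunkSize = Math.ceil(size / NUM_CHUNKS);

  for (let i = 0; i < NUM_CHUNKS; i++) {
    const start = i * chunkSize;
    const end = Math.min(start + chunkSize, size) - 1;
    const resp = await fetch(FILE_URL, { headers: { Range: `bytes=${start}-${end}` } });
    sourceBuffer.appendBuffer(await resp.arrayBuffer());
    // Wait for the previous append to finish before issuing the next one.
    await new Promise((resolve) =>
      sourceBuffer.addEventListener('updateend', resolve, { once: true }));
  }
  mediaSource.endOfStream();
});
```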

H264 video works using src attribute. Same video fails using the MediaSource API (Chromium)

爱⌒轻易说出口 submitted on 2019-12-18 17:14:13
Question: http://www.youtube.com/html5 indicates that Google Chrome is compliant with Media Source Extensions & H.264. I run a simple test checking that my video is supported by Chromium, using <video id='player' autoplay='true'> <source src='/test.mp4' type='video/mp4' /> </video> The video plays smoothly. A second alternative that also works fine consists of loading the byte stream through AJAX and converting the buffer to a URI object, then assigning that URI to the (video) source.src attribute.
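A file that plays through the src attribute can still be rejected by Media Source Extensions, most often because it is not a fragmented MP4 or because the codecs parameter is missing or wrong. A small diagnostic sketch; the codec string is an assumption that has to be adjusted to the real file, and /test.mp4 is taken from the question:

```javascript
const MIME = 'video/mp4; codecs="avc1.64000d,mp4a.40.2"'; // adjust to the real file
console.log('Supported by MSE:', MediaSource.isTypeSupported(MIME));

const video = document.getElementById('player');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer(MIME);
  sourceBuffer.addEventListener('error', (e) => console.error('SourceBuffer error', e));
  video.addEventListener('error', () => console.error('Video error', video.error));

  const data = await (await fetch('/test.mp4')).arrayBuffer();
  sourceBuffer.appendBuffer(data);   // a non-fragmented MP4 typically fails at this point
  sourceBuffer.addEventListener('updateend', () => mediaSource.endOfStream(), { once: true });
});
```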

Unable to get MediaSource working with mp4 format in chrome

删除回忆录丶 submitted on 2019-12-18 11:39:15
Question: Based on the example here, I downloaded the WebM file and encoded it as an MP4 file, which plays locally, but I'm unable to use it as a media source. MP4Box reports the codec to be avc1.64000d,mp4a.40.2, but adding it to the source buffer did not help. Here is a demo of the problem (I don't expect it to work in Firefox, as Media Source Extensions are not supported there yet) and here is the code I'm testing with: var FILE,CODEC,mediaSource; var NUM_CHUNKS = 5; var video = document.querySelector('video'
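One quick way to tell whether an MP4 can be fed to a SourceBuffer at all is to list its top-level box types: MSE needs a fragmented file (ftyp/moov followed by moof/mdat pairs), whereas a plain encode usually has a single large mdat and no moof boxes. A sketch of a minimal top-level box scanner; the file path is a placeholder and the whole file is assumed to fit in memory:

```javascript
// List top-level MP4 box types; a file usable with MSE should contain 'moof' boxes.
function listBoxes(arrayBuffer) {
  const view = new DataView(arrayBuffer);
  const boxes = [];
  let offset = 0;
  while (offset + 8 <= view.byteLength) {
    let size = view.getUint32(offset);
    const type = String.fromCharCode(
      view.getUint8(offset + 4), view.getUint8(offset + 5),
      view.getUint8(offset + 6), view.getUint8(offset + 7));
    if (size === 1) {
      // 64-bit "largesize" box; combine the high and low 32 bits.
      size = view.getUint32(offset + 8) * 2 ** 32 + view.getUint32(offset + 12);
    }
    if (size < 8) break;                 // malformed or to-end-of-file box, stop scanning
    boxes.push(type);
    offset += size;
  }
  return boxes;
}

fetch('/test.mp4')
  .then((r) => r.arrayBuffer())
  .then((buf) => console.log(listBoxes(buf))); // e.g. ['ftyp', 'moov', 'moof', 'mdat', ...]
```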

Flush & Latency Issue with Fragmented MP4 Creation in FFMPEG

为君一笑 submitted on 2019-12-17 15:47:08
Question: I'm creating a fragmented MP4 for HTML5 streaming, using the following command: -i rtsp://172.20.28.52:554/h264 -vcodec copy -an -f mp4 -reset_timestamps 1 -movflags empty_moov+default_base_moof+frag_keyframe -loglevel quiet - "-i rtsp://172.20.28.52:554/h264" because the source is an H.264 stream in RTP packets from an IP camera. For the sake of testing, the camera is set with a GOP of 1 (i.e. all frames are keyframes). "-vcodec copy" because I don't need transcoding, only remuxing to MP4. "
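When the fragmented MP4 is piped to stdout, latency usually comes from how large the fragments are and from how quickly the reader drains the pipe. A Node.js sketch that spawns roughly the command from the question with a fragment-duration cap added and forwards each chunk as soon as it arrives; the -frag_duration value, the forwarding stub, and whether capping fragment duration helps for a given GOP structure are all assumptions to verify:

```javascript
const { spawn } = require('child_process');

// Placeholder for pushing data to connected clients (e.g. over WebSockets).
function forwardToClients(chunk) {
  console.log('fragment bytes:', chunk.length);
}

// Roughly the command from the question, plus a fragment-duration cap.
// -frag_duration is in microseconds; 500000 (0.5 s) is an assumption.
const ffmpeg = spawn('ffmpeg', [
  '-i', 'rtsp://172.20.28.52:554/h264',
  '-vcodec', 'copy', '-an',
  '-f', 'mp4',
  '-reset_timestamps', '1',
  '-movflags', 'empty_moov+default_base_moof+frag_keyframe',
  '-frag_duration', '500000',
  'pipe:1'
]);

// Consume stdout as soon as chunks arrive instead of waiting for large reads.
ffmpeg.stdout.on('data', (chunk) => forwardToClients(chunk));
ffmpeg.stderr.on('data', (d) => process.stderr.write(d));
```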

Live streaming dash content using mp4box

空扰寡人 submitted on 2019-12-17 08:54:22
Question: I'm trying to live-stream H.264 content to HTML5 using the Media Source Extensions API. The following method works pretty well: ffmpeg -i rtsp://10.50.1.29/media/video1 -vcodec copy -f mp4 -reset_timestamps 1 -movflags frag_keyframe+empty_moov -loglevel quiet out.mp4 and then: mp4box -dash 1000 -frag 1000 -frag-rap out.mp4 I can take the MP4Box output (out_dashinit.mp4) and send it through WebSockets, chunk by chunk, to a JavaScript client that feeds it to the Media Source API. However,
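One way to keep feeding clients while MP4Box is still writing is to tail the growing out_dashinit.mp4 file and push only the newly appended bytes over the WebSocket. A Node.js sketch using the `ws` package; the port and 200 ms poll interval are assumptions, and it does not cache the init segment for late joiners (see the earlier sketch for that):

```javascript
const fs = require('fs');
const WebSocket = require('ws');

const FILE = 'out_dashinit.mp4';
const wss = new WebSocket.Server({ port: 8080 });
let sent = 0; // bytes already delivered to clients

// Poll the growing file and broadcast only the newly appended bytes.
setInterval(() => {
  fs.stat(FILE, (err, stats) => {
    if (err || stats.size <= sent) return;
    const stream = fs.createReadStream(FILE, { start: sent, end: stats.size - 1 });
    sent = stats.size;
    stream.on('data', (chunk) => {
      for (const ws of wss.clients) {
        if (ws.readyState === WebSocket.OPEN) ws.send(chunk);
      }
    });
  });
}, 200);
```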