media-source

'audio/wav' MIME type is not supported by the MSE SourceBuffer object, but an audio src attribute pointing at a .wav file plays just fine

狂风中的少年 submitted on 2020-07-22 05:53:07
Question: I'm trying to create a SourceBuffer via the W3C Media Source Extensions API with the MIME type 'audio/wav', like so: let sourceBuffer = mediaSource.addSourceBuffer('audio/wav'); However, I get a NotSupportedError: Failed to execute 'addSourceBuffer' on 'MediaSource': The type provided ('audio/wav') is unsupported. Also, running MediaSource.isTypeSupported('audio/wav'); in the browser console returns false in recent versions of both Firefox and Chrome. If I just set the src of the
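The behavior described above is expected: MSE only accepts the segmented container formats in its byte-stream format registry (fragmented MP4, WebM, MPEG-2 TS, MPEG audio), and plain WAV is not among them, while the ordinary audio src path uses the browser's regular media pipeline, which does decode WAV. A minimal sketch of probing for a type MSE does accept (the codec strings below are illustrative, not taken from the question):

```javascript
// Hypothetical sketch: return the first candidate MIME string the
// predicate accepts. The predicate is injected so this helper itself
// has no browser dependency.
function pickSupportedType(candidates, isSupported) {
  return candidates.find(isSupported) || null;
}

// In a browser (codec strings are illustrative):
// const type = pickSupportedType(
//   ['audio/wav',
//    'audio/webm; codecs="opus"',
//    'audio/mp4; codecs="mp4a.40.2"'],
//   (t) => MediaSource.isTypeSupported(t)
// );
// 'audio/wav' is skipped because WAV is not in the MSE byte-stream
// format registry; the fragmented MP4 or WebM candidate is chosen instead.
```

If no candidate is supported, decoding the WAV with the Web Audio API (decodeAudioData) is a common fallback.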

Is there a way to send video data from a video tag/MediaStream to an OffscreenCanvas?

余生长醉 submitted on 2020-07-16 07:40:34
Question: Basically, I want to be able to perform effectively this same code: const video = document.getElementById('video'); const canvas = document.getElementById('canvas'); const context = canvas.getContext('2d'); const draw = () => { context.drawImage(video, 0, 0); requestAnimationFrame(draw); } video.onplay = () => { requestAnimationFrame(draw); } only using an offscreen canvas. I can send images over messages to the worker the offscreen canvas is on, but not video, as it's directly tied to an
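One workaround sketch, under the assumption that the browser supports createImageBitmap and transferable ImageBitmap objects: snapshot each frame on the main thread with createImageBitmap(video) and transfer it to the worker that owns the OffscreenCanvas (the element ids and the 'draw-worker.js' file name are hypothetical):

```javascript
// Pure helper: scale (srcW, srcH) to fit inside (dstW, dstH) while
// preserving aspect ratio; used by the worker when drawing the frame.
function fitContain(srcW, srcH, dstW, dstH) {
  const s = Math.min(dstW / srcW, dstH / srcH);
  return { w: Math.round(srcW * s), h: Math.round(srcH * s) };
}

// Main thread (ids and worker file name are illustrative):
// const video = document.getElementById('video');
// const offscreen = document.getElementById('canvas').transferControlToOffscreen();
// const worker = new Worker('draw-worker.js');
// worker.postMessage({ canvas: offscreen }, [offscreen]);
// async function pump() {
//   if (video.paused || video.ended) return;
//   const frame = await createImageBitmap(video); // snapshot current frame
//   worker.postMessage({ frame }, [frame]);       // transfer, don't copy pixels
//   requestAnimationFrame(pump);
// }
// video.onplay = () => requestAnimationFrame(pump);

// draw-worker.js:
// let ctx, cw, ch;
// onmessage = ({ data }) => {
//   if (data.canvas) { ctx = data.canvas.getContext('2d'); cw = data.canvas.width; ch = data.canvas.height; }
//   else if (data.frame) {
//     const { w, h } = fitContain(data.frame.width, data.frame.height, cw, ch);
//     ctx.drawImage(data.frame, 0, 0, w, h);
//     data.frame.close(); // release the bitmap's memory promptly
//   }
// };
```

Since ImageBitmap is transferable, each postMessage moves the frame rather than structured-cloning its pixels, which keeps the per-frame cost low.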

HTML5 video stream from a websocket via MediaSource and SourceBuffer

醉酒当歌 submitted on 2020-07-15 09:14:27
Question: I'm trying to play video from a websocket: <video id="output" width="320" height="240" autoplay></video> <script> function sockets(buffer) { const socket = new WebSocket('wss://localhost:5002/ws') socket.onmessage = async function (event) { // event.data is a blob buffer.appendBuffer(new Uint8Array(event.data)) } } let ms = new MediaSource() let output = document.getElementById('output') output.src = URL.createObjectURL(ms) ms.onsourceopen = () => { let buffer = ms.addSourceBuffer('video/webm;
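Two likely problems in code shaped like the above: appendBuffer() throws an InvalidStateError if it is called while SourceBuffer.updating is still true, and a websocket's event.data arrives as a Blob, whose bytes must be extracted before appending. A minimal sketch of a serializing append queue (the codec string and URL in the comments are illustrative):

```javascript
// Hypothetical sketch: serialize appends so appendBuffer() is never called
// while SourceBuffer.updating is true. It only relies on an object exposing
// { updating, appendBuffer, addEventListener }, so it is testable off-browser.
class AppendQueue {
  constructor(sourceBuffer) {
    this.sb = sourceBuffer;
    this.pending = [];
    // When one append finishes, start the next queued one.
    this.sb.addEventListener('updateend', () => this.flush());
  }
  push(chunk) {
    this.pending.push(chunk);
    this.flush();
  }
  flush() {
    // Only one append may be in flight at a time.
    if (!this.sb.updating && this.pending.length > 0) {
      this.sb.appendBuffer(this.pending.shift());
    }
  }
}

// Browser wiring (codec string is illustrative; it must match the muxed data):
// ms.onsourceopen = () => {
//   const queue = new AppendQueue(ms.addSourceBuffer('video/webm; codecs="vp8,vorbis"'));
//   const socket = new WebSocket('wss://localhost:5002/ws');
//   socket.onmessage = async (event) => {
//     // event.data is a Blob; appendBuffer needs raw bytes.
//     queue.push(new Uint8Array(await event.data.arrayBuffer()));
//   };
// };
```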

If Blob URLs are immutable, how does Media Source Extension API use them to stream videos?

三世轮回 submitted on 2020-05-15 03:59:26
Question: Let's start with an example: you visit youtube.com, which uses Media Source Extensions (MSE) with HTML5 on certain devices. MSE gives the <video> tag a blob URL. It looks something like this: blob:https://www.youtube.com/blahblahblah Throughout streaming the entire video, your browser makes multiple network calls to download the various chunks of video and appends them to the MSE SourceBuffer. Therefore, the Media Source object as a whole is updated throughout the video stream
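The key detail is that URL.createObjectURL accepts a MediaSource as well as a Blob: the blob: URL names the live, mutable MediaSource object rather than a fixed byte sequence, so the immutability rule applies to Blobs, not to every object a blob: URL can reference. A small sketch (the codec string is illustrative):

```javascript
// A blob: URL is just a name for an object in the current document.
// Pure helper to recognize one, used in the browser sketch below.
function isBlobUrl(url) {
  return url.startsWith('blob:');
}

// Browser usage (codec string is illustrative):
// const ms = new MediaSource();
// const video = document.querySelector('video');
// video.src = URL.createObjectURL(ms); // e.g. "blob:https://www.youtube.com/…"
// console.log(isBlobUrl(video.src));   // true, and the URL never changes
// ms.addEventListener('sourceopen', () => {
//   const sb = ms.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
//   // Each downloaded chunk is appended here; the media element keeps
//   // reading from the same URL while the object it names grows.
// });
```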

FMP4 moof box sequence number ordering

*爱你&永不变心* submitted on 2020-03-25 18:04:10
Question: I want to build a basic fragmented MP4 broadcast program with the avformat libs, HTML5 video, and MSE. This is a live stream, and I use avformat to copy the H.264 data into MP4 fragments. Here is my basic drawing of clients attaching to the stream. So, in words: C1J: first client joins: the avformat process starts; ftyp, moov, moof, and mdat boxes will be served to Client1; the ftyp and moov atoms are both saved for later reuse. C2J: second client joins (later in time): the avformat process is ongoing (because it is
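When serving saved fragments to a late-joining client, one option is to rewrite each fragment's mfhd sequence_number so every client sees a monotonically increasing sequence from its own join point. A sketch of patching the number in place, assuming the mfhd box is the first child of moof (the usual layout in muxer output; a robust version would walk the box tree instead):

```javascript
// Hypothetical sketch: rewrite the sequence_number of a moof fragment.
// Box layout assumed:
//   moof: size(4) 'moof'(4)
//     mfhd: size(4) 'mfhd'(4) version+flags(4) sequence_number(4)
// so sequence_number sits at byte offset 20, big-endian.
function patchMoofSequence(moof, seq) {
  const outer = String.fromCharCode(moof[4], moof[5], moof[6], moof[7]);
  const inner = String.fromCharCode(moof[12], moof[13], moof[14], moof[15]);
  if (outer !== 'moof' || inner !== 'mfhd') {
    throw new Error('unexpected box layout');
  }
  const out = moof.slice(); // leave the caller's buffer untouched
  new DataView(out.buffer, out.byteOffset, out.byteLength).setUint32(20, seq);
  return out;
}

// Per-client usage: serve the saved ftyp+moov first, then for each live
// fragment send patchMoofSequence(moofBytes, clientLocalSeq++) followed
// by its mdat, so each client's sequence starts at 1 and increases.
```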
