Question
I'm trying to play video received over a WebSocket:
<video id="output" width="320" height="240" autoplay></video>
<script>
  function sockets(buffer) {
    const socket = new WebSocket('wss://localhost:5002/ws')
    socket.onmessage = async function (event) {
      // event.data is a blob
      buffer.appendBuffer(new Uint8Array(event.data))
    }
  }

  let ms = new MediaSource()
  let output = document.getElementById('output')
  output.src = URL.createObjectURL(ms)
  ms.onsourceopen = () => {
    let buffer = ms.addSourceBuffer('video/webm; codecs="vorbis,vp8"')
    sockets(buffer)
  }
</script>
I receive MediaRecorder chunks here as Blobs and try to play them sequentially using the MediaSource API. There are no errors, but nothing happens. Is there something fundamentally wrong here?
I tried:
- Using different codecs
- Playing with the SourceBuffer modes, e.g. 'sequence'/'segments'
- Different approaches that don't use the MediaSource API, but I faced other challenges, and MediaSource seems to be the best approach in my case.
UPDATE: this is how the video is produced:
let options = { mimeType: 'video/webm;codecs=vp8' }
let stream = await navigator.mediaDevices.getUserMedia({ video: true })
mediaRecorder = new MediaRecorder(stream, options)
mediaRecorder.ondataavailable = event => {
  if (event.data && event.data.size > 0) {
    send(event.data)
  }
}
Answer 1:
The fundamental problem here is that you cannot stream the data coming out of MediaRecorder and expect the other end to play it; it is not a complete video. It will only work if the receiving end is able to receive the initialization bytes, and I doubt that will work in a real-world scenario.
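Even in the case where the initialization segment does arrive (it is at the front of the very first Blob that MediaRecorder emits), the receiving side still has two problems the question's code runs into: a WebSocket delivers Blobs by default (so `new Uint8Array(event.data)` does not yield the bytes), and `appendBuffer` throws if called while a previous append is still updating. A minimal receiver sketch under those assumptions (`playFromSocket` is a name made up for illustration):

```javascript
// Sketch of a receiver that assumes the first message carries the WebM
// initialization segment. Appends are queued because SourceBuffer rejects
// a new appendBuffer() while the previous one is still updating.
function playFromSocket(videoEl, url) {
  const ms = new MediaSource()
  videoEl.src = URL.createObjectURL(ms)
  ms.addEventListener('sourceopen', () => {
    const sb = ms.addSourceBuffer('video/webm; codecs="vorbis,vp8"')
    const queue = []
    sb.addEventListener('updateend', () => {
      if (queue.length) sb.appendBuffer(queue.shift())
    })
    const socket = new WebSocket(url)
    socket.binaryType = 'arraybuffer' // receive ArrayBuffer, not Blob
    socket.onmessage = event => {
      if (sb.updating || queue.length) queue.push(event.data)
      else sb.appendBuffer(event.data)
    }
  })
}
```

This still only works when the stream begins with the initialization segment, which is exactly the constraint described above.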
What you can do is create an interval that starts and stops the MediaRecorder, for example every second, to produce one-second video chunks that you can transmit over the wire (the best transport I know and have tested is WebSockets).
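The point of restarting the recorder is that every chunk then begins with its own initialization segment, so each one is a complete, independently playable WebM file. A sketch of that idea (`recordChunks` and `send` are illustrative names, not part of any API):

```javascript
// Sketch: restart MediaRecorder every `chunkMs` so each emitted Blob is a
// self-contained WebM file (with its own initialization segment).
function recordChunks(stream, send, chunkMs = 1000) {
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8' })
  recorder.ondataavailable = event => {
    if (event.data && event.data.size > 0) send(event.data)
  }
  // When this recorder stops, start a fresh one for the next chunk.
  recorder.onstop = () => recordChunks(stream, send, chunkMs)
  recorder.start()
  setTimeout(() => recorder.stop(), chunkMs)
}
```

The trade-off is a small seam between chunks and the per-chunk overhead of repeating the container headers, but the receiver no longer depends on having seen the first message of the whole session.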
I strongly suggest not using MediaRecorder if you are doing real-time video streaming (which was not indicated in your post). If you are, it would be better to copy the stream to a canvas and do some requestAnimationFrame work to capture your video stream into something you can transmit.
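The canvas approach trades the WebM container for individual frames: draw the playing video element onto a canvas each animation frame and ship the frame (e.g. as a JPEG Blob) over the socket. A rough sketch of that idea, with `streamFrames` as a made-up name and JPEG quality chosen arbitrarily:

```javascript
// Sketch: capture frames from a playing <video> element via a canvas and
// send each frame over an open WebSocket as a JPEG blob.
function streamFrames(videoEl, socket) {
  const canvas = document.createElement('canvas')
  canvas.width = videoEl.videoWidth || 320
  canvas.height = videoEl.videoHeight || 240
  const ctx = canvas.getContext('2d')
  function capture() {
    ctx.drawImage(videoEl, 0, 0, canvas.width, canvas.height)
    canvas.toBlob(blob => {
      if (blob && socket.readyState === WebSocket.OPEN) socket.send(blob)
    }, 'image/jpeg', 0.7)
    requestAnimationFrame(capture)
  }
  requestAnimationFrame(capture)
}
```

The receiver then just paints each frame (e.g. into an <img> or another canvas), so there is no container parsing at all; audio has to be handled separately.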
Take a look at this demo for reference: https://github.com/cyberquarks/quarkus-websockets-streamer/blob/master/src/main/resources/META-INF/resources/index.html
In my experience, MediaRecorder's output is delayed, which generally adds quite a delay to the video, not to mention the delay the socket also introduces.
Generally, other developers will suggest that you just take the WebRTC route; however, based on my experience, WebRTC is not necessarily faster either.
Source: https://stackoverflow.com/questions/61459893/html5-video-stream-from-websocket-via-mediasource-and-mediasourcebuffer