mediastream

Saving desktopCapturer to video file in Electron

℡╲_俬逩灬, submitted 2019-12-03 03:26:58
The desktopCapturer API example shows how to write a screen-capture stream to a <video> element:

    // In the renderer process.
    var desktopCapturer = require('electron').desktopCapturer;
    desktopCapturer.getSources({types: ['window', 'screen']}, function(error, sources) {
      if (error) throw error;
      for (var i = 0; i < sources.length; ++i) {
        if (sources[i].name == "Electron") {
          navigator.webkitGetUserMedia({
            audio: false,
            video: {
              mandatory: {
                chromeMediaSource: 'desktop',
                chromeMediaSourceId: sources[i].id,
                minWidth: 1280, maxWidth: 1280,
                minHeight: 720, maxHeight: 720
              }
            }
          }, gotStream,

Blob video duration metadata [duplicate]

杀马特。学长 韩版系。学妹, submitted 2019-12-03 02:22:32
This question already has an answer here: How can I add predefined length to audio recorded from MediaRecorder in Chrome? (3 answers)

I am writing software that manipulates a camera video stream in Firefox. I am generating a video-type Blob recorded with the MediaRecorder API. To save the blob as a video file locally I use the FileSaver library:

    FileSaver.saveAs(BlobVideo, "video.mp4");

The video seems to have no duration, so I cannot navigate the timeline of the newly generated video in VLC, for example. Is there a way to set the duration metadata on a video blob?
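The duration is missing because MediaRecorder streams WebM without writing a duration header. A common workaround (sketched below, not part of the question) is to track the recording length yourself and, on playback, force the browser to compute a duration by seeking far past the end. Function names are illustrative:

```javascript
// Sketch: work around missing duration metadata in MediaRecorder WebM blobs.

// Pure helper: recording length in seconds from wall-clock timestamps.
function recordedSeconds(startMs, stopMs) {
  return Math.max(0, (stopMs - startMs) / 1000);
}

// Browser-side trick: seeking far beyond the end forces the browser to scan
// the file and fill in a real duration, after which the timeline is seekable.
function fixWebmDuration(videoEl, onReady) {
  videoEl.addEventListener('loadedmetadata', function () {
    if (videoEl.duration === Infinity) {
      videoEl.currentTime = Number.MAX_SAFE_INTEGER;
      videoEl.ontimeupdate = function () {
        videoEl.ontimeupdate = null;
        videoEl.currentTime = 0; // rewind; duration is now populated
        if (onReady) onReady(videoEl.duration);
      };
    } else if (onReady) {
      onReady(videoEl.duration);
    }
  });
}
```

Also note that Firefox's MediaRecorder emits WebM (or Ogg for audio), so saving the blob as "video.mp4" does not make it an MP4 container; external players may behave better with a matching .webm extension.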

webRTC convert webm to mp4 with ffmpeg.js

南楼画角, submitted 2019-12-02 21:08:11
I am trying to convert WebM files to mp4 with ffmpeg.js. I am recording a video from a canvas (overlaid with some information) and recording the audio data from the video:

    stream = new MediaStream();
    var videoElem = document.getElementById('video');
    var videoStream = videoElem.captureStream();
    stream.addTrack(videoStream.getAudioTracks()[0]);
    stream.addTrack(canvas.captureStream().getVideoTracks()[0]);

    var options = {mimeType: 'video/webm'};
    recordedBlobs = [];
    mediaRecorder = new MediaRecorder(stream, options);
    mediaRecorder.onstop = handleStop;
    mediaRecorder.ondataavailable =
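For the conversion step itself, ffmpeg.js exposes a synchronous call that takes a virtual in-memory filesystem (MEMFS) and an arguments array. The sketch below assumes the MP4-enabled build of ffmpeg.js; the codec arguments are illustrative choices, not from the question:

```javascript
// Pure helper: build the ffmpeg argument list for a WebM-to-MP4 conversion.
function buildArgs(inputName, outputName) {
  return ['-i', inputName, '-c:v', 'libx264', '-c:a', 'aac', outputName];
}

// Sketch: run the conversion with ffmpeg.js (assumes the ffmpeg-mp4 build).
function webmToMp4(webmUint8Array) {
  var ffmpeg = require('ffmpeg.js/ffmpeg-mp4.js');
  var result = ffmpeg({
    MEMFS: [{ name: 'input.webm', data: webmUint8Array }],
    arguments: buildArgs('input.webm', 'output.mp4'),
  });
  // The converted file comes back on the virtual filesystem.
  return new Blob([result.MEMFS[0].data], { type: 'video/mp4' });
}
```

Converting in the browser this way is CPU-heavy; running ffmpeg.js inside a Web Worker keeps the page responsive during the encode.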

Canvas recording using captureStream and mediaRecorder

偶尔善良, submitted 2019-11-27 21:57:11
Question: How can I record streams from more than one canvas? That is, when I switch from one canvas to another, the recording of the now-active canvas has to continue from the first. I have done this:

    stream = canvas.captureStream();
    mediaRecorder = new MediaRecorder(stream, options);
    mediaRecorder.ondataavailable = handleDataAvailable;
    mediaRecorder.start(10);

    function handleDataAvailable(event) {
      recordedBlobs.push(event.data);
    }

But when adding another stream, only the first part is recorded. I am pushing recorded data
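One workable approach (a sketch, not from the question) is to record a single hidden "mixer" canvas for the whole session and keep drawing whichever canvas is currently active onto it via requestAnimationFrame, so the MediaRecorder never sees a track change. All names below are illustrative:

```javascript
// Sketch: record several canvases through one "mixer" canvas.
function createMixedRecorder(width, height, options) {
  var mixer = document.createElement('canvas');
  mixer.width = width;
  mixer.height = height;
  var ctx = mixer.getContext('2d');
  var active = null;
  var chunks = [];

  // The recorder only ever sees the mixer's single, uninterrupted stream.
  var recorder = new MediaRecorder(mixer.captureStream(30), options);
  recorder.ondataavailable = function (e) { chunks.push(e.data); };

  (function draw() {
    if (active) ctx.drawImage(active, 0, 0, width, height);
    requestAnimationFrame(draw);
  })();

  return {
    start: function () { recorder.start(10); },
    stop: function () { recorder.stop(); },
    setActiveCanvas: function (c) { active = c; }, // switch source at any time
    getChunks: function () { return chunks; },
  };
}
```

Switching canvases then becomes `rec.setActiveCanvas(otherCanvas)`, with no restart of the recorder and no second stream to merge.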

How to addTrack in MediaStream in WebRTC

放肆的年华, submitted 2019-11-27 02:09:15
I'm using WebRTC to communicate between two peers. I want to add a new track to an already-generated stream, because I want to give users the ability to switch their microphone during an audio call. The code I'm using is below. Let "pc" be the peerConnection object through which audio communication takes place, and "newStream" be the new MediaStream obtained from getUserMedia with the newly selected microphone device:

    var localStreams = pc.getLocalStreams()[0];
    localStreams.removeTrack(localStreams.getAudioTracks()[0]);
    var audioTrack = newStream.getAudioTracks()[0];
    localStreams.addTrack
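In current WebRTC, the cleaner way to switch microphones (a sketch, not the asker's code) is RTCRtpSender.replaceTrack(), which swaps the outgoing track without touching the stream or renegotiating. The sender-lookup helper below is a hypothetical pure function:

```javascript
// Pure helper: find the sender currently carrying an audio track.
function findAudioSender(senders) {
  for (var i = 0; i < senders.length; ++i) {
    if (senders[i].track && senders[i].track.kind === 'audio') return senders[i];
  }
  return null;
}

// Sketch: swap microphones on a live RTCPeerConnection.
function switchMicrophone(pc, newStream) {
  var sender = findAudioSender(pc.getSenders());
  var newTrack = newStream.getAudioTracks()[0];
  if (sender && newTrack) {
    return sender.replaceTrack(newTrack); // resolves when the swap completes
  }
  return Promise.reject(new Error('no audio sender or new track available'));
}
```

Note that getLocalStreams()/removeTrack()/addTrack() on the stream object, as in the question, is the legacy stream-based API; the sender-based API above is what the spec standardized.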

How to use web audio api to get raw pcm audio?

心已入冬, submitted 2019-11-26 23:29:55
Question: How can I use getUserMedia to access the microphone in Chrome and then stream it to get raw audio? I need to get the audio in linear 16.

Answer 1: Unfortunately, MediaRecorder doesn't support raw PCM capture. (A sad oversight, in my opinion.) Therefore, you'll need to get the raw samples and buffer/save them yourself. You can do this with the ScriptProcessorNode. Normally, this node is used to modify the audio data programmatically, for custom effects and what not. But there's no reason you can't just
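The buffering step the answer alludes to usually ends with converting the Web Audio float samples (range -1..1) to linear 16-bit integers. A minimal sketch, with the conversion factored into a pure helper; the function names are illustrative:

```javascript
// Pure helper: convert Web Audio float samples (-1..1) to linear 16-bit PCM.
function floatTo16BitPCM(float32Samples) {
  var out = new Int16Array(float32Samples.length);
  for (var i = 0; i < float32Samples.length; i++) {
    var s = Math.max(-1, Math.min(1, float32Samples[i])); // clamp out-of-range samples
    out[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
  }
  return out;
}

// Sketch: capture raw PCM from the microphone with a ScriptProcessorNode.
function captureRawPCM(onChunk) {
  navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
    var ctx = new AudioContext();
    var source = ctx.createMediaStreamSource(stream);
    var processor = ctx.createScriptProcessor(4096, 1, 1);
    processor.onaudioprocess = function (e) {
      onChunk(floatTo16BitPCM(e.inputBuffer.getChannelData(0)));
    };
    source.connect(processor);
    processor.connect(ctx.destination); // Chrome requires a connected output for processing to run
  });
}
```

ScriptProcessorNode is deprecated in favor of AudioWorklet, but the float-to-int16 conversion is the same either way.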

CanvasCaptureMediaStream / MediaRecorder Frame Synchronization

本小妞迷上赌, submitted 2019-11-26 20:57:10
When using CanvasCaptureMediaStream and MediaRecorder, is there a way to get an event on each frame? What I need is not unlike requestAnimationFrame(), but I need it for the CanvasCaptureMediaStream (and/or the MediaRecorder), not for the window. The MediaRecorder could be running at a different frame rate than the window (possibly one that doesn't divide evenly, such as 25 FPS vs 60 FPS), so I want to update the canvas at the recorder's frame rate rather than the window's. This example currently only fully works in Firefox, since Chrome simply stops the canvas stream when the tab is blurred...
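There is no per-frame event on the capture stream itself, so a common fallback (sketched here, under that assumption) is to drive drawing from requestAnimationFrame but throttle it to the recorder's target rate. The timing helper is pure and the names are illustrative:

```javascript
// Pure helper: decide whether enough time has passed to emit the next frame
// at the target frame rate; returns the new "last frame" timestamp, or null
// if it is too early to draw.
function nextFrameTime(lastMs, nowMs, fps) {
  var interval = 1000 / fps;
  if (nowMs - lastMs >= interval) {
    // Advance by whole intervals so rounding error doesn't accumulate.
    return lastMs + Math.floor((nowMs - lastMs) / interval) * interval;
  }
  return null;
}

// Sketch: redraw the canvas at ~25 FPS even though rAF fires at the display rate.
function drawAtFixedRate(draw, fps) {
  var last = performance.now();
  (function loop(now) {
    var t = nextFrameTime(last, now, fps);
    if (t !== null) {
      last = t;
      draw(); // update the canvas; captureStream picks up the change
    }
    requestAnimationFrame(loop);
  })(last);
}
```

This sidesteps the 25-vs-60 FPS mismatch in the question, though it cannot fix Chrome halting the stream on blurred tabs, since rAF stops firing there too.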

MediaStream Capture Canvas and Audio Simultaneously

跟風遠走, submitted 2019-11-26 17:46:30
I'm working on a project in which I'd like to:

- Load a video via JS and display it on the canvas.
- Use filters to alter the appearance of the canvas (and therefore the video).
- Use the MediaStream captureStream() method and a MediaRecorder object to record the surface of the canvas and the audio of the original video.
- Play the stream of both the canvas and the audio in an HTML video element.

I've been able to display the canvas recording in a video element by tweaking this WebRTC demo code: https://webrtc.github.io/samples/src/content/capture/canvas-record/

That said, I can't figure out how to record
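For the part the question trails off on, recording the canvas video together with the source video's audio, the standard pattern (a sketch, not from the demo code) is to build one MediaStream from the canvas's video track plus the video element's audio track and hand that combined stream to MediaRecorder. The track-merging step is a pure helper; other names are illustrative:

```javascript
// Pure helper: merge video and audio track lists into one array.
function combineTracks(videoTracks, audioTracks) {
  return videoTracks.concat(audioTracks);
}

// Sketch: record a filtered canvas plus the original video's audio.
function recordCanvasWithAudio(canvas, videoElem, onBlob) {
  var canvasStream = canvas.captureStream(30);
  var videoStream = videoElem.captureStream(); // mozCaptureStream() on older Firefox
  var mixed = new MediaStream(
    combineTracks(canvasStream.getVideoTracks(), videoStream.getAudioTracks())
  );

  var chunks = [];
  var recorder = new MediaRecorder(mixed, { mimeType: 'video/webm' });
  recorder.ondataavailable = function (e) { chunks.push(e.data); };
  recorder.onstop = function () { onBlob(new Blob(chunks, { type: 'video/webm' })); };
  recorder.start();
  return recorder; // call .stop() when done
}
```

The resulting blob can be set as the `src` of a video element via URL.createObjectURL to play back both the canvas frames and the audio together.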