Save captured video to file in Electron


Question


I want to save a video that is captured from webcam to a local file. So far I have been able to:

  1. Create a stream with getUserMedia
  2. Encapsulate the stream with RecordRTC
  3. Get blob from RecordRTC

I cannot figure out how to save the video to a file, though. Calling save() on RecordRTC lets me download the video file, and that file is playable, but I want everything to happen in Node.js so I can process the video further. I tried writing the blob and its dataURL to a file, but the resulting file is not playable.


Answer 1:


The MediaRecorder class, implemented in Chromium and therefore available in Electron, can support your use case on its own, save for the part of writing the media to a local file. The latter isn't, to my knowledge, possible to achieve with any standardized Web API (as of the time of writing), but since Electron embeds Node.js, it's not only possible but arguably trivial:

// Constraints are required by getUserMedia; adjust them to your needs
navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(function(stream) {
    const recorder = new MediaRecorder(stream);
    const blob_reader = new FileReader();
    // `path` is the destination file; it must be defined elsewhere (see remarks below)
    const storage_stream = require("fs").createWriteStream(path);
    const blobs = [];
    // Once a blob has been read, append its bytes to the file and start
    // reading the next queued blob, if there is one
    blob_reader.addEventListener("load", function(ev) {
        storage_stream.write(Buffer.from(ev.currentTarget.result));
        if(blobs.length) {
            ev.currentTarget.readAsArrayBuffer(blobs.shift());
        }
    });
    // If the reader is busy (readyState 1, LOADING), queue the blob;
    // otherwise start reading it right away
    recorder.addEventListener("dataavailable", function(ev) {
        if(blob_reader.readyState != 1) {
            blob_reader.readAsArrayBuffer(ev.data);
        } else {
            blobs.push(ev.data);
        }
    });
});

Aside from what I consider a technically expendable and undesired (but necessary) step of converting Blob objects to their ArrayBuffer equivalents, this is as efficient as the underlying API implementation -- the JavaScript machine itself does no heavy lifting here.

Remarks and explanations on the snippet above

  • There won't be any action until you actually start the media recorder by calling the MediaRecorder.start method (see the sketch after this list). Note that the snippet is written to cope with multiple generated blobs, if needed -- using a timeslice (the first argument to start) of 1 second may be a good idea, depending on your needs. Such a timeslice lets you stream the data properly, as opposed to having a potentially gigantic single blob's worth of encoded video stream sitting in process memory (which is what you get if you omit the timeslice parameter to start).
  • As soon as a call to start is issued (with a timeslice), the resulting file will start growing "on disk", depending on the timeslice value and lengths of intermediate buffers.
  • A MediaRecorder object, as part of encoding media, generates blobs which, for one reason or another, aren't very "consumable" by many other APIs, so we have to convert them to something that is, in this case instances of the more convenient ArrayBuffer class.
  • Since converting a blob into an array buffer is asynchronous, we keep a queue of blobs that is duly converted on a FIFO basis.
  • require("fs") gets us a Node.js module, "fs" is not a module otherwise available in your Web browser, at least not according to a Web standard draft I know of. That's the module, however, that allows us to dump the resulting array buffers into a file.
  • The perhaps inconspicuous expression Buffer.from(...) does more than meets the eye here -- there is no Buffer class in the Web API space; it's a Node.js class that can wrap an ArrayBuffer as a view (no data copying). This is necessary because you can't write an ArrayBuffer into a file stream directly.
  • A concatenation of the data generated by a media recorder object is a valid media container (with respect to the MIME type of the recorder object), so simply handing these data to the stream, in the proper order of course, is sufficient to get a valid media container file (e.g. a .webm or an .mp4).
  • The resulting file is, however, a so-called transport stream -- some video players may or may not be able to seek on the data reliably or efficiently. ffmpeg, however, can trivially postprocess such files by indexing and patching them accordingly. I consider this step optional -- there is nothing inherently wrong with a transport stream like the ones generated by the snippet above.
  • The path variable passed to createWriteStream above denotes the path to the file you want to save the video to. The variable isn't declared or defined in the snippet, but it obviously must be.
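
For completeness, here is a minimal sketch of starting and stopping the recorder. It assumes the code is placed inside the same then callback as the snippet above; the 1-second timeslice and the stopRecording helper are illustrative choices, not part of the original answer.

// Inside the then(...) callback above, once both listeners are set up:
recorder.start(1000); // request a blob of encoded media roughly every second

// Hypothetical helper to stop recording, e.g. wired to a button click.
// Stopping the recorder fires a final "dataavailable" event; since the
// FileReader works asynchronously, close storage_stream only after the
// last queued blob has been written out.
function stopRecording() {
    recorder.stop();
    stream.getTracks().forEach(function(track) {
        track.stop();
    });
}

Omitting the timeslice (calling recorder.start() with no argument) makes the recorder deliver a single blob when it is stopped, which is exactly the in-memory buildup the first bullet point warns about.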

I have tested the snippet with Electron 3.0.4 on Windows 10.




Answer 2:


I have never used RecordRTC; I used the native (JavaScript) MediaRecorder API to record instead, and wrote a sample based on it.
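
In outline, a recording sketch built on the MediaRecorder API alone could look like the following; the constraints, the video/webm MIME type, the 1-second timeslice and the recordedChunks array are assumptions for this example, not the original sample:

navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(function(stream) {
    const recordedChunks = [];
    const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
    // Collect each encoded chunk as it becomes available
    recorder.addEventListener("dataavailable", function(ev) {
        if (ev.data.size > 0) {
            recordedChunks.push(ev.data);
        }
    });
    // When recording stops, concatenate the chunks into one Blob for further processing
    recorder.addEventListener("stop", function() {
        const recording = new Blob(recordedChunks, { type: "video/webm" });
        // hand `recording` off from here, e.g. write it to disk as in Answer 1
    });
    recorder.start(1000); // one chunk of encoded media per second
    // recorder.stop() ends the recording and triggers the "stop" handler above
});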



Source: https://stackoverflow.com/questions/41046709/save-captured-video-to-file-in-electron
