Recording video simultaneously with audio in Chrome blocks on the main thread, causing invalid audio


Question


So, I have what I think is a fairly interesting and, hopefully, not intractable problem. I have an audio/video getUserMedia stream that I am recording in Chrome. Individually, the tracks record perfectly well. However, when attempting to record both, one blocks the main thread, hosing the other. I know that there is a way to resolve this. Muaz Khan has a few demos that seem to work without blocking.

Audio is recorded via the Web Audio API. I am piping the audio track into a ScriptProcessor node which converts it to 16-bit mono and streams it to a node.js server.

Video is recorded via the usual canvas hack and Whammy.js. While recording, video frames are drawn to a canvas and the resulting image data is pushed into a frames array, which is later encoded into a webm container by Whammy and uploaded to the node.js server.

The two are then muxed together via ffmpeg server-side and the result stored.

The ideas I've had so far are:

  • Delegate one of the two to a worker thread. Unfortunately, both the canvas and the stream are tied to the DOM as far as I know (see the sketch after this list).
  • Install a headless browser in node.js and establish an RTC connection with the client, thereby exposing the entire stream server-side.
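
A rough sketch of the first idea, under the assumption that the DOM-bound capture (drawImage, onaudioprocess) stays on the main thread and only the socket upload moves into a worker. The file name uploadWorker.js, the message shape, and the WebSocket endpoint are placeholders, not part of the original code:

// main thread: hand each audio chunk to a worker that owns the upload socket
var uploadWorker = new Worker('uploadWorker.js');

node.onaudioprocess = function (e) {
    if (!recording.audio) return;
    var samples = convertFloat32ToInt16(e.inputBuffer.getChannelData(0));
    // transfer the underlying buffer instead of copying it
    uploadWorker.postMessage({ type: 'audio', samples: samples }, [samples.buffer]);
};

// uploadWorker.js: a plain WebSocket owned by the worker thread
var socket = new WebSocket('wss://example.com/audio'); // placeholder endpoint
self.onmessage = function (msg) {
    if (msg.data.type === 'audio' && socket.readyState === WebSocket.OPEN) {
        socket.send(msg.data.samples);
    }
};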

The entire situation will eventually be solved by the Audio Worker implementation, though the working group seems to have stalled public progress updates on that while things are shuffled around a bit.

Any suggestions for resolving the thread blocking?

Web Audio Connections:

var context = new AudioContext();
// feed the getUserMedia stream into the Web Audio graph
var source = context.createMediaStreamSource(stream);
// 2048-sample buffer, mono in, mono out
var node = context.createScriptProcessor(2048, 1, 1);
node.onaudioprocess = audioProcess;
source.connect(node);
// keep the processor connected to a destination so onaudioprocess keeps firing
node.connect(context.destination);

Web Audio Processing:

// body of audioProcess(e): bail out unless audio recording is active
if (!recording.audio) return;
var leftChannel = e.inputBuffer.getChannelData(0);
// convert the Float32 samples to 16-bit PCM and stream the chunk to the node.js server
Socket.emit('record-audio', convertFloat32ToInt16(leftChannel));
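
For completeness, a plausible sketch of the convertFloat32ToInt16 helper, which is referenced above but not shown in the question: Web Audio samples arrive as Float32 values in [-1, 1] and are rescaled to signed 16-bit PCM for the server.

// assumed implementation of the helper used above (not shown in the question)
function convertFloat32ToInt16(buffer) {
    var result = new Int16Array(buffer.length);
    for (var i = 0; i < buffer.length; i++) {
        // clamp, then scale to the signed 16-bit range
        var s = Math.max(-1, Math.min(1, buffer[i]));
        result[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
    }
    return result;
}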

Video Frame Buffering:

if (recording.video) {
    // paint the background, then draw the current video frame onto the canvas
    players.canvas.context.fillRect(0, 0, players.video.width, players.video.height);
    players.canvas.context.drawImage(players.video.element, 0, 0, players.video.width, players.video.height);
    // buffer the frame as a webp data URI for later Whammy encoding
    frames.push({
        duration: 100,
        image: players.canvas.element.toDataURL('image/webp')
    });
    lastTime = new Date().getTime();
    requestAnimationFrame(drawFrame);
} else {
    // recording stopped: encode and upload the buffered frames
    requestAnimationFrame(getBlob);
}
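
The getBlob call above is not shown in the question; under the assumption that it compiles the buffered frames with Whammy and uploads the result, it might look roughly like this (the 'record-video' event name is a placeholder):

// assumed shape of getBlob(): compile the buffered frames into a webm Blob and upload it
function getBlob() {
    var encoder = new Whammy.Video(10); // assumed ~10 fps
    frames.forEach(function (frame) {
        encoder.add(frame.image, frame.duration);
    });
    var webmBlob = encoder.compile();
    Socket.emit('record-video', webmBlob); // placeholder event name
    frames = [];
}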

Update: I've since managed to stop the two operations from completely blocking one another, but they still interfere enough to distort my audio.


Answer 1:


There are a few key things that allow for successful getUserMedia recording in Chrome at the moment, drawn from the helpful comments on the original question and from my own experience.

  1. When harvesting data from the recording canvas, encode as JPEG. I had been using WebP to satisfy the requirements of Whammy.js, but generating a WebP data URI is apparently a cycle hog.
  2. Delegate as much of the non-DOM work as possible to worker threads. This is especially true of any streaming/upload operations (e.g., audio sample streaming via websockets).
  3. Avoid requestAnimationFrame as a means of driving the recording canvas. It is resource intensive and, as aldel has pointed out, can fail if the user switches tabs. Using setInterval is much more efficient and reliable, and it also allows for better framerate control.
  4. For Chrome at least, avoid client-side AV encoding for the time being. Stream audio samples and video frames to the server for processing (a combined sketch follows this list). While client-side AV encoding libraries are very cool, they simply don't seem efficient enough for production just yet.
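
A minimal sketch combining points 1, 3, and 4, reusing the players, recording, and Socket names from the question; the interval, JPEG quality, and 'record-frame' event name are assumptions:

var FRAME_INTERVAL_MS = 100; // ~10 fps; tune as needed
var frameTimer = null;

function startFrameCapture() {
    frameTimer = setInterval(function () {
        if (!recording.video) {
            clearInterval(frameTimer);
            return;
        }
        players.canvas.context.drawImage(
            players.video.element, 0, 0,
            players.video.width, players.video.height);
        // JPEG is far cheaper to encode than WebP, and the frame goes straight
        // to the server instead of into a client-side webm encoder
        Socket.emit('record-frame', players.canvas.element.toDataURL('image/jpeg', 0.8));
    }, FRAME_INTERVAL_MS);
}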

Also, for Node.js ffmpeg automation, I highly recommend fluent-ffmpeg. Special thanks to Benjamin Trent for some practical examples.
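
As a rough illustration of the server-side muxing step with fluent-ffmpeg (file names and codec choices are placeholders):

var ffmpeg = require('fluent-ffmpeg');

// mux the uploaded video and audio into a single webm (placeholder paths)
ffmpeg()
    .input('uploads/recording.webm')
    .input('uploads/recording.wav')
    .videoCodec('copy')      // keep the VP8 video stream as-is
    .audioCodec('libvorbis') // re-encode the PCM/WAV audio for the webm container
    .on('error', function (err) { console.error('mux failed:', err.message); })
    .on('end', function () { console.log('mux complete'); })
    .save('output/recording.webm');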




Answer 2:


@aldel is right. Increasing the bufferSize value fixes it, e.g. bufferSize = 16384.
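
Applied to the ScriptProcessor setup from the question, that is a one-argument change:

// a 16384-sample buffer makes onaudioprocess fire far less often,
// so brief main-thread stalls no longer starve the audio callback
var node = context.createScriptProcessor(16384, 1, 1);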

Try this demo in Chrome and record audio + video. You'll hear a clean recorded WAV alongside 720p video frames.

BTW, I agree with jesup that MediaRecorder solutions should be preferred.

The Chromium team is very close, and hopefully M47/M48 will bring a MediaRecorder implementation, at least for video (VP8) recording.
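
For reference, a minimal MediaRecorder sketch of the shape that API takes; it records the getUserMedia stream directly and sidesteps the canvas/ScriptProcessor pipeline entirely (the mimeType and timeslice values are assumptions):

var recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
var chunks = [];

recorder.ondataavailable = function (e) {
    if (e.data && e.data.size > 0) chunks.push(e.data);
};
recorder.onstop = function () {
    var blob = new Blob(chunks, { type: 'video/webm' });
    // upload the blob, or hand it to a download link, etc.
};

recorder.start(1000); // emit data roughly every second
// ... later: recorder.stop();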

There is a Chrome-based alternative to Whammy.js as well:

  • https://github.com/streamproc/MediaStreamRecorder/issues/43


Source: https://stackoverflow.com/questions/29217361/recording-video-simultaneously-with-audio-in-chrome-blocks-on-main-thread-causi
