Question
For my project I record user audio using MediaRecorder and it almost works fine. My problem arises when I want to display a waveform of the user recording using Wavesurfer.js, which doesn't load my recording. Playing the recording with an Audio element works fine, though.
After trying different things, it seems this is because the final .webm file has barely any metadata, not even a duration or bitrate (even though I set the bitrate in the MediaRecorder options). Here is the output from ffprobe for one of the files:
Input #0, matroska,webm, from '206_3.webm':
  Metadata:
    encoder : Chrome
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0(eng): Audio: opus, 48000 Hz, mono, fltp (default)
So my question is: am I doing something wrong to record the audio? Here is how I start the recording:
// Somewhere in the code...
this._handleUserMedia(await navigator.mediaDevices.getUserMedia({ audio: true }));

// ... and elsewhere
_handleUserMedia(stream) {
  this._mediaRecorder = new MediaRecorder(stream, { audioBitsPerSecond: 64000 });

  this._mediaRecorder.ondataavailable = event => {
    this._mediaBuffer.push(event.data);
  };

  this._mediaRecorder.onstop = () => {
    // Add the buffer and a URL to the buffer to the results, for saving and playback
    let blob = new Blob(this._mediaBuffer, { type: "audio/webm" });
    this.state.results[this.state.currentWordIdx].recordingBlob = blob;
    this.state.results[this.state.currentWordIdx].recordingUrl = URL.createObjectURL(blob);

    // Reset the buffer for the next recording
    this._mediaBuffer = [];
    this._gotoNextWord();
  };

  this._gotoNextWord();
}
As you can see, I create a blob which I save later on with NodeJS's fs.writeFile (a sketch of that step follows the snippet below). Then, when I need to display the waveform, I load the file using fs.readFile like this:
fs.readFile(`${this.getAppData()}/${filePath}`, (err, buffer) => {
  if (err) { reject(err); }
  const blob = new Blob([buffer], { type: 'audio/webm' });
  resolve(URL.createObjectURL(blob)); // If an ArrayBuffer is needed => toArrayBuffer(buffer)
});
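The save step itself isn't shown in the question. A minimal sketch of how the blob might be written to disk with fs.writeFile, assuming an Electron-style renderer with access to Node's fs module (saveRecording is a hypothetical helper, not from the original code):

const fs = require("fs");

// Hypothetical helper: convert the recorded Blob into a Node Buffer and write it to disk.
async function saveRecording(blob, filePath) {
  const arrayBuffer = await blob.arrayBuffer(); // Blob -> ArrayBuffer
  const buffer = Buffer.from(arrayBuffer);      // ArrayBuffer -> Node Buffer
  await fs.promises.writeFile(filePath, buffer);
}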
Answer 1:
I believe the reason there is so little metadata is that, by default, MediaRecorder produces a variable bitrate file, for which a single bitrate value is not meaningful; presumably (although I'm not sure) this also explains the lack of a clear duration value.
The spec recently added the ability to request a constant bitrate for recording, with an implementation expected to land in Chromium (M89).
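A minimal sketch of what that could look like, assuming a browser that implements the audioBitrateMode option from the MediaRecorder spec:

const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const recorder = new MediaRecorder(stream, {
  mimeType: "audio/webm;codecs=opus",
  audioBitsPerSecond: 64000,
  audioBitrateMode: "constant" // browsers without support ignore this option and keep variable bitrate
});
console.log(recorder.audioBitrateMode); // reports the mode actually in use ("constant" or "variable")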
Source: https://stackoverflow.com/questions/52186254/no-metadata-when-recording-an-audio-webm-with-mediarecorder