web-audio-api

Convert AudioBuffer to ArrayBuffer / Blob for WAV Download

╄→尐↘猪︶ㄣ submitted 2020-06-15 05:51:07
Question: I'd like to convert an AudioBuffer to a Blob so that I can create an ObjectURL from it and then download the audio file.

let rec = new Recorder(async (chunks) => {
  var blob = new Blob(chunks, { type: 'audio/mp3' });
  var arrayBuffer = await blob.arrayBuffer();
  const audioContext = new AudioContext();
  await audioContext.decodeAudioData(arrayBuffer, (audioBuffer) => {
    // How do I now convert the AudioBuffer into an ArrayBuffer => Blob?
  });
});

Answer 1: An AudioBuffer contains non-interleaved Float32Array …
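The answer's approach can be sketched as follows: take each channel's Float32 samples from the AudioBuffer, interleave them, convert to 16-bit PCM, and prepend a standard 44-byte RIFF/WAVE header. A minimal sketch — the helper names (`audioBufferToWav`, `writeString`) are mine, not from the thread:

```javascript
// Encode non-interleaved Float32 channel data as a 16-bit PCM WAV file.
function audioBufferToWav(channels, sampleRate) {
  const numChannels = channels.length;
  const numFrames = channels[0].length;
  const bytesPerSample = 2;
  const dataSize = numFrames * numChannels * bytesPerSample;
  const buffer = new ArrayBuffer(44 + dataSize);
  const view = new DataView(buffer);

  const writeString = (offset, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };

  writeString(0, 'RIFF');
  view.setUint32(4, 36 + dataSize, true);       // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  view.setUint32(16, 16, true);                 // fmt sub-chunk size
  view.setUint16(20, 1, true);                  // audio format 1 = PCM
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
  view.setUint16(32, numChannels * bytesPerSample, true);              // block align
  view.setUint16(34, 16, true);                 // bits per sample
  writeString(36, 'data');
  view.setUint32(40, dataSize, true);

  // Interleave channels and clamp floats in [-1, 1] to signed 16-bit integers.
  let offset = 44;
  for (let frame = 0; frame < numFrames; frame++) {
    for (let ch = 0; ch < numChannels; ch++) {
      const s = Math.max(-1, Math.min(1, channels[ch][frame]));
      view.setInt16(offset, s < 0 ? s * 0x8000 : s * 0x7fff, true);
      offset += 2;
    }
  }
  return buffer;
}
```

Inside the `decodeAudioData` callback one would collect `audioBuffer.getChannelData(ch)` for each channel, call `audioBufferToWav(channels, audioBuffer.sampleRate)`, and wrap the result in `new Blob([wav], { type: 'audio/wav' })` for the ObjectURL.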

Check if selected microphone is muted or not with web audio api

时光总嘲笑我的痴心妄想 submitted 2020-05-27 04:02:34
Question: By using the following, we can prompt the user to select their preferred media input device with audio and video source constraints (currently only interested in Chrome support).

navigator.mediaDevices.getUserMedia({ audio: true })
  .then((stream) => {
    console.log(stream);
  });

Does anyone know if there is an exposed API to detect whether the user-selected input device is currently muted? The input device could be an onboard microphone, an external mic, or a software-defined microphone that shows …
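For reference, MediaStreamTrack exposes a boolean `muted` attribute along with `mute`/`unmute` events; this reflects the source being muted outside the page (OS or hardware level), not merely an input level of zero. A small watcher sketch — the function name is mine, and it works with any EventTarget-like object carrying a `muted` property:

```javascript
// Watch a MediaStreamTrack-like object and report every mute state change.
function watchMuted(track, onChange) {
  const report = () => onChange(track.muted);
  track.addEventListener('mute', report);
  track.addEventListener('unmute', report);
  report(); // report the initial state immediately
}
```

In a page this would be used as `watchMuted(stream.getAudioTracks()[0], m => console.log('muted:', m))` after `getUserMedia` resolves; detecting a *silent but unmuted* microphone would additionally require sampling levels with an AnalyserNode.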

How to rapidly play multiple copies of a soundfile in javascript

人走茶凉 submitted 2020-05-17 06:36:32
Question: I'm building a wheel of fortune in HTML + JS that spins rather quickly. Every time a new color flies past the mark, the wheel should play a click sound. At top speed this sounds almost like a machine gun, so a new playback has to start before the previous one has finished. The file itself is always the same: click.wav. It works fine in Chrome, and only in Chrome. Firefox has a weird bug where it only plays the sound if there is some other audio source active, such as a YouTube video playing in a …
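The usual fix for overlapping playback is the Web Audio API rather than cloned HTMLAudioElements: decode click.wav once into an AudioBuffer, then fire a throwaway AudioBufferSourceNode per click, since source nodes are one-shot and cheap to create. A sketch under that assumption; the factory shape and names are mine:

```javascript
// Decode the click once, then fire a fresh one-shot source node per play.
// AudioBufferSourceNode instances are single-use, so overlapping plays each get their own node.
function makeClickPlayer(ctx, clickBuffer) {
  return function playClick() {
    const src = ctx.createBufferSource();
    src.buffer = clickBuffer;
    src.connect(ctx.destination);
    src.start();
    return src;
  };
}
```

In the page one would `fetch` the file, run it through `ctx.decodeAudioData` once, build the player, and call `playClick()` every time the marker passes a color.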

Web Audio API: updating looping buffer data in real time, impossible in Firefox?

孤街醉人 submitted 2020-05-17 06:12:07
Question: I am running into an issue when trying to change the underlying buffer data while a looping buffer is being played back.

bufferData = audioContext.bufferSourceNode.buffer.getChannelData(0);
bufferData[100] = newValue;

This kind of behavior seems to work fine in most browsers (I've tested Chrome, Safari, Opera, and Edge), but it doesn't seem to be possible in Firefox. This looks like a bug. I read in a StackOverflow question from 2015 that this way of updating a buffer …

Clicking sounds in Stream played with Web Audio Api

为君一笑 submitted 2020-05-15 06:41:04
Question: I have a strange problem. I'm using Web Audio to play a stream from the server, like this:

var d2 = new DataView(evt.data);
var data = new Float32Array(d2.byteLength / Float32Array.BYTES_PER_ELEMENT);
for (var jj = 0; jj < data.length; ++jj) {
  data[jj] = d2.getFloat32(jj * Float32Array.BYTES_PER_ELEMENT, true);
}
var buffer = context.createBuffer(1, data.length, 44100);
buffer.getChannelData(0).set(data);
source = context.createBufferSource();
source.buffer = buffer;
source …
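An aside on the decode step: if the server sends little-endian IEEE-754 floats, the manual DataView loop from the question can be isolated and verified as below. The clicks themselves usually come from gaps between consecutive buffers, so each source should be scheduled with `start(when)` against a running time cursor (initialized from `context.currentTime` plus a little latency, advanced by `buffer.duration` per chunk) rather than started the moment data arrives. The function name here is mine:

```javascript
// Decode a binary message of little-endian 32-bit floats into a Float32Array,
// mirroring the DataView loop from the question.
function decodeFloats(arrayBuffer) {
  const view = new DataView(arrayBuffer);
  const out = new Float32Array(view.byteLength / Float32Array.BYTES_PER_ELEMENT);
  for (let i = 0; i < out.length; i++) {
    out[i] = view.getFloat32(i * Float32Array.BYTES_PER_ELEMENT, true); // true = little-endian
  }
  return out;
}
```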

Combining audio and video tracks into new MediaStream

柔情痞子 submitted 2020-05-09 01:27:54
Question: I need to create a MediaStream using audio and video from different MediaStreams. In Firefox, I can instantiate a new MediaStream from an array of tracks:

var outputTracks = [];
outputTracks = outputTracks.concat(outputAudioStream.getTracks());
outputTracks = outputTracks.concat(outputVideoStream.getTracks());
outputMediaStream = new MediaStream(outputTracks);

Unfortunately, this doesn't work in Chrome: ReferenceError: MediaStream is not defined. Is there an alternative method in Chrome …
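In older Chrome the constructor was vendor-prefixed as `webkitMediaStream`; another portable route is to start from an empty stream and `addTrack` each track. A sketch of the track-gathering half — `collectTracks` is my own helper name, and it works with any objects exposing `getTracks()`:

```javascript
// Gather all tracks from several streams into one array.
function collectTracks(...streams) {
  return streams.reduce((tracks, s) => tracks.concat(s.getTracks()), []);
}

// Browser usage, assuming one of the two constructors exists:
// const StreamCtor = window.MediaStream || window.webkitMediaStream;
// const combined = new StreamCtor(collectTracks(outputAudioStream, outputVideoStream));
```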

Web Audio API — squaring a signal by using a Gain

纵饮孤独 submitted 2020-02-06 08:07:27
Question: Should it be possible to square a signal by creating a Gain instance and connecting the signal both to the gain input and to its amplitude control parameter? I ask because I am seeing odd results, at least in Firefox. I can see that Tone.js uses a wave-shaper instead for a pow operation, so perhaps that is the way to go. But I'm curious, since the API says the gain parameter is audio-rate; obviously there must be some delays involved.

Answer 1: This works for me:

var c = new AudioContext();
var o = c …
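Whichever node does the squaring, the expected output for a sine input is fixed by the identity sin²(ωt) = (1 - cos 2ωt) / 2: a DC offset of one half plus a component at twice the input frequency, which is a useful sanity check when inspecting the result with an analyser. A quick numeric verification of the identity:

```javascript
// Verify sin^2(x) === (1 - cos(2x)) / 2 across sample points, i.e. squaring a
// sine yields DC 0.5 plus a double-frequency cosine. Returns the worst error seen.
function maxIdentityError(samples = 1000) {
  let maxErr = 0;
  for (let i = 0; i < samples; i++) {
    const x = (i / samples) * 2 * Math.PI;
    const err = Math.abs(Math.sin(x) ** 2 - (1 - Math.cos(2 * x)) / 2);
    maxErr = Math.max(maxErr, err);
  }
  return maxErr;
}
```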

html5 audio mp3 not working in firefox

≯℡__Kan透↙ submitted 2020-02-03 08:28:07
Question: I have this jsFiddle which works perfectly in Chrome and Safari but does not work in Firefox. Sample code:

<!DOCTYPE html>
<html>
<head>
  <title>Simple Media Player</title>
  <style>
    body, div { margin: 0; }
  </style>
</head>
<body>
  <audio preload="auto" controls autoplay>
    <source src="http://dev-audio-test.s3-website-us-east-1.amazonaws.com/08xCf21_niXjQmGmanVUrR0Tk2h2mKSMw_sxg03CrycaxhNiqhX9_NFYhHBw7eJcp_ru52kdQRW88YigtmTE0w==.mp3" type="audio/mpeg" />
  </audio>
</body>
</html>

I have set up a …
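A defensive pattern for codec issues like this is to feature-detect with `canPlayType` and list an Ogg/Vorbis `<source>` as a fallback; Firefox historically relied on OS codecs for MP3 decoding on some platforms. A sketch of a source picker — `pickPlayableSource` is my own name; `canPlayType` is passed in so the logic stays independent of the DOM:

```javascript
// Pick the first source the element claims it can play.
// canPlayType returns '', 'maybe', or 'probably'; treat any non-empty answer as playable.
function pickPlayableSource(canPlayType, sources) {
  for (const s of sources) {
    if (canPlayType(s.type) !== '') return s;
  }
  return null;
}
```

In a page: `pickPlayableSource(t => audioEl.canPlayType(t), [{ src: 'a.mp3', type: 'audio/mpeg' }, { src: 'a.ogg', type: 'audio/ogg' }])`.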