web-audio-api

How do you combine many audio tracks into one for mediaRecorder API?

无人久伴 submitted on 2020-08-07 05:43:49
Question: I want to make a recording where I get multiple audio tracks from different MediaStream objects (some of them remote), use the getAudioTracks() method, and add them to a MediaStream object using addTrack(). When passing this last object as a parameter to MediaRecorder, I realized that it only records the audio track at position [0]. That leads me to understand that MediaRecorder records one track per type. Is there any way to join these tracks into one to…
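Since MediaRecorder only records the first audio track of the stream it is given, the usual workaround is to mix every source through the Web Audio graph into a single MediaStreamAudioDestinationNode, whose stream carries exactly one combined track. A minimal sketch (the mixAudioStreams helper name and the streams argument are illustrative, not from the original post):

```javascript
// Mix the audio of several MediaStreams into one stream that MediaRecorder
// will record in full. Each stream becomes a source node feeding one shared
// destination node, which sums them into a single audio track.
function mixAudioStreams(audioCtx, streams) {
  const destination = audioCtx.createMediaStreamDestination();
  for (const stream of streams) {
    const source = audioCtx.createMediaStreamSource(stream);
    source.connect(destination); // all sources are summed at the destination
  }
  // destination.stream holds exactly one mixed audio track
  return destination.stream;
}

// Usage (browser only, names illustrative):
// const ctx = new AudioContext();
// const mixed = mixAudioStreams(ctx, [localStream, remoteStream]);
// const recorder = new MediaRecorder(mixed);
// recorder.start();
```

Because the mixing happens in the audio graph rather than via addTrack(), the recorder never sees more than one audio track.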

Web Audio API: Stop all scheduled sounds from playing

て烟熏妆下的殇ゞ submitted on 2020-07-19 06:46:38
Question: I have a bunch of loaded audio samples that I call the scheduler function with in the code below: let audio; function playChannel() { let audioStart = context.currentTime; let next = 0; for (let i = 0; i < 8; i++) { scheduler(audioStart, next); next++; } } Here is the audio scheduler function: function scheduler(audioStart, index) { audio = context.createBufferSource(); audio.buffer = audioSamples[index]; // array with all the loaded audio audio.connect(context.destination); audio.start…
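Because the single shared audio variable only remembers the last scheduled node, earlier sources cannot be reached to stop them. A common fix is to keep every scheduled AudioBufferSourceNode in an array and call stop() on each; this sketch restructures the scheduler along those lines (scheduleSample and stopAll are illustrative names, not from the original code):

```javascript
// Keep a reference to every scheduled AudioBufferSourceNode so all of them
// can be stopped later, instead of only the most recently created one.
const scheduledSources = [];

function scheduleSample(context, buffer, when) {
  const source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.start(when);
  scheduledSources.push(source); // remember it so stopAll() can reach it
  return source;
}

function stopAll() {
  for (const source of scheduledSources) {
    // stop() cancels playback even if the node's start time is in the future
    source.stop();
  }
  scheduledSources.length = 0; // source nodes are one-shot; drop them
}
```

Since AudioBufferSourceNodes are single-use, clearing the array after stopping avoids calling stop() twice on the same node.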

Frontend JavaScript request gets 302-redirected but ultimately fails

ε祈祈猫儿з submitted on 2020-07-09 14:29:48
Question: I'm trying to create an audio visualization for a podcast network, using the Web Audio API with createMediaElementSource(), very similarly to the model explained in this tutorial. So far I've gotten it to work fine in Chrome, and you can see it here (note: click on the red box to start it). Update: Based on discussion in the comments, it has become clear that the problem happens because the request gets redirected to another URL by way of a 302 redirect. However, Safari refuses to work,…
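One likely culprit in this kind of setup is CORS: createMediaElementSource() only yields audio data when the media element is fetched in CORS mode and every hop of the redirect chain (both the 302 response and the final URL) sends an appropriate Access-Control-Allow-Origin header. The sketch below shows the client side only; the helper name createCorsAudioSource and the doc parameter are illustrative, and the server-side header requirement is an assumption about this setup, not a confirmed diagnosis:

```javascript
// Build an audio element that requests its file in CORS mode, then wrap it
// in a MediaElementAudioSourceNode. Without crossOrigin, a cross-origin or
// redirected response is opaque and the Web Audio graph outputs silence.
function createCorsAudioSource(audioCtx, doc, url) {
  const el = doc.createElement('audio');
  el.crossOrigin = 'anonymous'; // fetch in CORS mode (no credentials)
  el.src = url;
  return { element: el, node: audioCtx.createMediaElementSource(el) };
}

// Usage (browser only):
// const ctx = new AudioContext();
// const { element, node } = createCorsAudioSource(ctx, document, audioUrl);
// node.connect(ctx.destination);
```

If any response in the redirect chain lacks the CORS header, browsers differ in how loudly they fail, which could explain Chrome and Safari behaving differently.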

How to set srcObject on audio element with React

。_饼干妹妹 submitted on 2020-07-08 04:56:30
Question: I've been trying to set the src attribute of an audio tag in React, but the track never plays. playTrack(track) { const stream = new MediaStream() stream.addTrack(track) this.setState(() => ({ stream })) } render() { return ( <audio src={this.state.stream || null} controls volume="true" autoPlay /> ) } When I check in the Chrome debugger it shows that the audio tag has [MediaStream] set as its source, but nothing plays and all the controls remain grayed out. Doing this instead of setting…
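The audio element's src attribute expects a URL string, so handing it a MediaStream does nothing; the stream belongs on the element's srcObject property, which JSX attributes don't cover, so it is typically assigned through a ref. A small sketch (attachStream is an illustrative helper name, not from the original post):

```javascript
// Assign a MediaStream via the element's srcObject property.
// Designed to be called from a React ref callback, which receives the
// underlying DOM element (or null on unmount).
function attachStream(el, stream) {
  if (el && stream && el.srcObject !== stream) {
    el.srcObject = stream; // <audio src=...> cannot take a MediaStream
  }
  return el;
}

// In a React component (illustrative):
// <audio ref={(el) => attachStream(el, this.state.stream)} controls autoPlay />
```

Guarding on el.srcObject !== stream keeps repeated renders from reassigning the same stream and restarting playback.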

Getting PCM data from wavesurfer.js backend/web audio api

寵の児 submitted on 2020-06-27 16:25:08
Question: I am using wavesurfer.js to create a multitrack player online and want to export a remixed version of the combined tracks with levels, panning, etc. First I have an array of audio files and use it to create an array of wavesurfer elements: for (var i = 0; i < audiofiles.length; i++) { spectrum[i] = WaveSurfer.create({ }); } I then create a buffer for each of these from the wavesurfer backend: for (var i = 0; i < audiofiles.length; i++) { var ctx = spectrum[i].backend.ac; var length = spectrum[i].getDuration()…
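One way to render such a mix outside wavesurfer's live playback is an OfflineAudioContext: feed each track's decoded AudioBuffer through gain and panner nodes and render the whole graph into a single mixed AudioBuffer. This is a sketch under assumptions, not wavesurfer API; renderMix and the {buffer, level, pan} track shape are illustrative:

```javascript
// Render several decoded tracks, each with its own level and pan, into one
// mixed stereo AudioBuffer using an offline (faster-than-realtime) context.
function renderMix(OfflineCtx, tracks, sampleRate, lengthInSamples) {
  const ctx = new OfflineCtx(2, lengthInSamples, sampleRate);
  for (const t of tracks) {
    const source = ctx.createBufferSource();
    source.buffer = t.buffer;   // decoded AudioBuffer for this track
    const gain = ctx.createGain();
    gain.gain.value = t.level;  // per-track level, e.g. 0..1
    const pan = ctx.createStereoPanner();
    pan.pan.value = t.pan;      // -1 (left) .. 1 (right)
    // connect() returns its destination node, so the chain reads left to right
    source.connect(gain).connect(pan).connect(ctx.destination);
    source.start(0);
  }
  return ctx.startRendering();  // Promise resolving with the mixed AudioBuffer
}

// Usage (browser only):
// renderMix(OfflineAudioContext, tracks, 44100, 44100 * durationSeconds)
//   .then((mixedBuffer) => { /* encode or play mixedBuffer */ });
```

The rendered AudioBuffer can then be encoded (for example to WAV) for download.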

Downloading audio from web that has been modified with wavesurfer.js

走远了吗. submitted on 2020-06-16 03:37:09
Question: I have created a multitrack web player using wavesurfer.js which can adjust the levels and panning of the different tracks. What I want to do is export the mixed tracks, with new levels and panning, as a single .wav file. I've done a bit of research into this and a lot of people point to https://github.com/mattdiamond/Recorderjs, but development stopped on it over 4 years ago and, from what I've found, it seems to have a load of issues. Just initializing it like so: var rec = new Recorder…
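Instead of the unmaintained Recorder.js, a 16-bit PCM WAV file can be written by hand from an AudioBuffer-like object (anything with numberOfChannels, sampleRate, length, and getChannelData). A sketch (encodeWav is an illustrative name; it assumes float samples in [-1, 1]):

```javascript
// Encode an AudioBuffer-like object as a 16-bit PCM WAV file: a 44-byte
// RIFF header followed by interleaved little-endian samples. The returned
// ArrayBuffer can be wrapped in new Blob([...], { type: 'audio/wav' }).
function encodeWav(buffer) {
  const numChannels = buffer.numberOfChannels;
  const sampleRate = buffer.sampleRate;
  const numFrames = buffer.length;
  const bytesPerSample = 2;
  const dataSize = numFrames * numChannels * bytesPerSample;
  const out = new DataView(new ArrayBuffer(44 + dataSize));

  const writeString = (offset, s) => {
    for (let i = 0; i < s.length; i++) out.setUint8(offset + i, s.charCodeAt(i));
  };
  writeString(0, 'RIFF');
  out.setUint32(4, 36 + dataSize, true);  // RIFF chunk size
  writeString(8, 'WAVE');
  writeString(12, 'fmt ');
  out.setUint32(16, 16, true);            // fmt chunk size
  out.setUint16(20, 1, true);             // audio format: PCM
  out.setUint16(22, numChannels, true);
  out.setUint32(24, sampleRate, true);
  out.setUint32(28, sampleRate * numChannels * bytesPerSample, true); // byte rate
  out.setUint16(32, numChannels * bytesPerSample, true);              // block align
  out.setUint16(34, 16, true);            // bits per sample
  writeString(36, 'data');
  out.setUint32(40, dataSize, true);

  // Interleave channels and convert float [-1, 1] to signed 16-bit PCM.
  let offset = 44;
  for (let i = 0; i < numFrames; i++) {
    for (let ch = 0; ch < numChannels; ch++) {
      const sample = Math.max(-1, Math.min(1, buffer.getChannelData(ch)[i]));
      out.setInt16(offset, sample < 0 ? sample * 0x8000 : sample * 0x7fff, true);
      offset += bytesPerSample;
    }
  }
  return out.buffer;
}
```

Combined with an offline render of the mixed tracks, this covers the export-to-.wav step without depending on Recorder.js.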