web-audio-api

How can I avoid CORS restrictions for the Web Audio API?

Submitted by 人盡茶涼 on 2019-12-01 04:32:33
Question: I'm trying to create a visualization for an audio stream, but I run into CORS trouble when I try to get access to the raw audio data with the createMediaElementSource() function. Is there a way to avoid this restriction and get raw audio data from a stream on another origin? Perhaps using WebSockets? Answer 1: There are five ways to deal with the protections against cross-origin retrieval: CORS headers -- this is ideal, but you need the cooperation of the third-party server; JSONP -- not appropriate for streaming
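
If the remote server can be persuaded to send CORS headers (the first option above), a minimal sketch of the visualization setup might look like this - the stream URL below is a placeholder, not from the original question:

// Sketch only: assumes the server responds with Access-Control-Allow-Origin.
var audio = new Audio();
audio.crossOrigin = 'anonymous';                 // request the stream in CORS mode
audio.src = 'https://example.com/stream.mp3';    // hypothetical stream URL

var ctx = new (window.AudioContext || window.webkitAudioContext)();
var source = ctx.createMediaElementSource(audio);
var analyser = ctx.createAnalyser();
analyser.fftSize = 256;

source.connect(analyser);
analyser.connect(ctx.destination);               // keep the stream audible

var data = new Uint8Array(analyser.frequencyBinCount);
function draw() {
  analyser.getByteFrequencyData(data);           // raw frequency data for the visualization
  // ...render `data` to a canvas here...
  requestAnimationFrame(draw);
}
audio.play();
draw();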

Why certain .wav files cannot be decoded in Firefox

Submitted by 半腔热情 on 2019-12-01 03:59:45
Question: I have a web page which decodes wave files for certain reasons. Chrome and Safari seem to work fine, but Firefox is occasionally unable to decode the file and gives the error: "The buffer passed to decodeAudioData contains invalid content which cannot be decoded successfully." I have created a jsfiddle which illustrates the issue:
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var source;
function getData() {
  source = audioCtx.createBufferSource();
  request = new
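
For reference, the full decode round-trip with an error callback might look like the sketch below (the file URL is a placeholder, not from the jsfiddle). Firefox is reportedly stricter than Chrome about WAV files with unusual headers or extra chunks, and such files surface in the error callback:

// Minimal sketch of the same pattern with error handling; 'sample.wav' is hypothetical.
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();

function getData(url) {
  var request = new XMLHttpRequest();
  request.open('GET', url, true);
  request.responseType = 'arraybuffer';
  request.onload = function () {
    audioCtx.decodeAudioData(
      request.response,
      function (buffer) {
        var source = audioCtx.createBufferSource();
        source.buffer = buffer;
        source.connect(audioCtx.destination);
        source.start(0);
      },
      function (err) {
        // Files Firefox considers malformed end up here.
        console.error('decodeAudioData failed:', err);
      }
    );
  };
  request.send();
}

getData('sample.wav');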

Why isn't the Web Audio API supported in Node.js?

Submitted by 試著忘記壹切 on 2019-12-01 03:22:22
I understand the Web Audio API is a client-side feature, but Node.js is based on V8, Chrome's client-side implementation of ECMAScript, which includes the Web Audio API. Why is there no complete support for the Web Audio API in Node.js? Is it because AudioContext is based on the global window object? Am I missing a point here? Is there a plan to make it available in the future? Node.js doesn't support Web Audio because it isn't part of the JavaScript language itself - it's a separate web platform JavaScript API. You can think of it like Web Workers, requestAnimationFrame or XMLHttpRequest - they are part of
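
A quick way to see the distinction is to run the same checks in a browser console and in the Node.js REPL:

// ECMAScript built-ins exist in both environments:
console.log(typeof Array);         // "function" in the browser and in Node.js
// Web platform APIs exist only where a browser provides them:
console.log(typeof AudioContext);  // "function" in the browser, "undefined" in Node.js
console.log(typeof window);        // "object" in the browser, "undefined" in Node.js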

Why isn't the Web Audio API supported in Node.js?

Submitted by 一世执手 on 2019-11-30 22:55:32
Question: I understand the Web Audio API is a client-side feature, but Node.js is based on V8, Chrome's client-side implementation of ECMAScript, which includes the Web Audio API. Why is there no complete support for the Web Audio API in Node.js? Is it because AudioContext is based on the global window object? Am I missing a point here? Is there a plan to make it available in the future? Answer 1: Node.js doesn't support Web Audio because it isn't part of the JavaScript language itself - it's a separate web platform

Load audio data into AudioBufferSourceNode from <audio/> element via createMediaElementSource?

Submitted by 拜拜、爱过 on 2019-11-30 18:31:49
Is it possible to have an audio file loaded from an <audio/> element via createMediaElementSource and then load the audio data into an AudioBufferSourceNode? Using the audio element as a source (MediaElementSource) does not seem to be an option, as I want to use buffer methods like noteOn and noteGrainOn. Loading the audio file directly into the buffer via XHR unfortunately isn't an option either (see Open stream_url of a Soundcloud Track via Client-Side XHR?). Loading the buffer contents from the audio element does seem to be possible though: http://www.w3.org/2011/audio/wiki/Spec_Differences#Reading
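
For context, the buffer methods mentioned above only exist on AudioBufferSourceNode, which is why a MediaElementSource won't do. A sketch of the usual buffer path, assuming the audio bytes have been obtained some other way (arrayBuffer below is a stand-in, since direct XHR is ruled out in the question):

var ctx = new (window.AudioContext || window.webkitAudioContext)();
ctx.decodeAudioData(arrayBuffer, function (audioBuffer) {
  var source = ctx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(ctx.destination);
  // Older drafts named these methods noteOn()/noteGrainOn(); the current API is start():
  source.start(0, 2.0, 1.5);   // play 1.5 seconds starting 2 seconds into the buffer
});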

How can I play audio in reverse with the Web Audio API?

Submitted by 帅比萌擦擦* on 2019-11-30 14:53:18
Question: How can I play audio in reverse with the Web Audio API? I can't seem to find anything in the API docs... Answer 1: You could do something like this:
var context = new AudioContext(),
    request = new XMLHttpRequest();
request.open('GET', 'path/to/audio.mp3', true);
request.responseType = 'arraybuffer';
request.addEventListener('load', function () {
  context.decodeAudioData(request.response, function (buffer) {
    var source = context.createBufferSource();
    // Reversing the underlying Float32Arrays reverses the audio in place.
    Array.prototype.reverse.call(buffer.getChannelData(0));
    Array.prototype.reverse.call(buffer.getChannelData(1));
    source.buffer = buffer;
  });
});
request.send();   // the request must actually be sent

Phonegap mixing audio files

Submitted by 假如想象 on 2019-11-30 13:21:56
Question: I'm building a karaoke app using PhoneGap for iOS. I have audio files in the www/assets folder that I am able to play using the media.play() function. This allows the user to listen to the backing track. While the media is playing, another Media instance is recording. Once the recording has finished, I need to layer the voice recording over the backing track, and I have no idea how I might go about doing this. One approach I thought might work is to use the Web Audio API - I have the
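
One hedged sketch of that approach: decode both recordings with the Web Audio API and start two buffer sources at the same time, so they are summed at the destination. The file paths below are placeholders, and this assumes a WebView modern enough for fetch and promise-based decodeAudioData:

var ctx = new (window.AudioContext || window.webkitAudioContext)();

function loadBuffer(url) {
  return fetch(url)
    .then(function (res) { return res.arrayBuffer(); })
    .then(function (data) { return ctx.decodeAudioData(data); });
}

Promise.all([loadBuffer('assets/backing.mp3'), loadBuffer('voice-recording.wav')])
  .then(function (buffers) {
    buffers.forEach(function (buffer) {
      var source = ctx.createBufferSource();
      source.buffer = buffer;
      source.connect(ctx.destination);   // both sources are mixed at the destination
      source.start(0);                   // start together so the voice lines up with the backing track
    });
  });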

Offline / Non-Realtime Rendering with the Web Audio API

Submitted by 你。 on 2019-11-30 13:21:34
The Problem: I'm working on a web application where users can sequence audio samples and optionally apply effects to the musical patterns they create using the Web Audio API. The patterns are stored as JSON data, and I'd like to do some analysis of the rendered audio of each pattern server-side. This leaves me with two options, as far as I can see: run my own rendering code server-side, trying to make it as faithful as possible to the in-browser rendering (maybe I could even pull out the Web Audio code from the Chromium project and modify that, but this seems like potentially a lot of work); or do
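
For rendering in the browser without playing in real time, the relevant tool is OfflineAudioContext: it runs a node graph faster than real time and hands back an AudioBuffer. A minimal sketch, with the pattern-building step left as a placeholder:

// Render 5 seconds of stereo audio at 44.1 kHz without playing it aloud.
var offlineCtx = new OfflineAudioContext(2, 44100 * 5, 44100);

// ...build the node graph for one pattern here, connected to offlineCtx.destination...

offlineCtx.startRendering().then(function (renderedBuffer) {
  // renderedBuffer is an AudioBuffer; its channel data can be analysed
  // or encoded (e.g. to WAV) and uploaded to the server.
  console.log('Rendered', renderedBuffer.duration, 'seconds of audio');
});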

Record HTML5 SpeechSynthesisUtterance generated speech to file

Submitted by 老子叫甜甜 on 2019-11-30 12:56:05
Question: I am able to generate speech from text using Chrome's Speech Synthesis API (in Version 33.0.1750.112 beta-m) in the following manner:
var transcript = document.getElementById("speechTxt").value;
var msg = new SpeechSynthesisUtterance(transcript);
speechSynthesis.speak(msg);
Now I want to save this speech in a file (maybe using the Web Audio API). Is this possible through some function call? I have looked at the methods in the Speech Synthesis API and there is nothing to save this speech data. Using

How can I play audio in reverse with the Web Audio API?

Submitted by 拈花ヽ惹草 on 2019-11-30 12:28:22
How can I play audio in reverse with the Web Audio API? I can't seem to find anything in the API docs... You could do something like this:
var context = new AudioContext(),
    request = new XMLHttpRequest();
request.open('GET', 'path/to/audio.mp3', true);
request.responseType = 'arraybuffer';
request.addEventListener('load', function () {
  context.decodeAudioData(request.response, function (buffer) {
    var source = context.createBufferSource();
    // Reversing the underlying Float32Arrays reverses the audio in place.
    Array.prototype.reverse.call(buffer.getChannelData(0));
    Array.prototype.reverse.call(buffer.getChannelData(1));
    source.buffer = buffer;
  });
});
request.send();   // the request must actually be sent
It's a super
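
One note on the snippet: it only prepares the reversed buffer. To actually hear it, the source still has to be wired up and started, for example (not shown in the answer above):

// Inside the decodeAudioData callback, after source.buffer = buffer:
source.connect(context.destination);
source.start(0);   // older implementations used source.noteOn(0)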