web-audio-api

html5/javascript audio play multiple tracks at the same time

风流意气都作罢 submitted on 2019-12-20 02:40:10

Question: Is it possible to play multiple audio instances at the same time? I know I could create a new instance each time and call .play() on each one, but that feels dirty:

sound1 = new Audio('sound1.mp3');
sound2 = new Audio('sound2.mp3');
sound1.play();
sound2.play();

It's more or less the same as in this thread: Play multiple sounds at the same time. But surely there is a more elegant way? Answer 1: Edit: MediaController and mediagroup turned out to be vaporware. You will want to stick with the…
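The pattern the asker describes can be wrapped in a small helper. This is only a sketch: the helper name is made up, the file names are the asker's, and the element factory is injectable so the logic can be exercised outside a browser.

```javascript
// Hypothetical helper: build one HTMLAudioElement per URL, then start
// them all in the same tick so they play simultaneously.
// `audioFactory` defaults to the browser's Audio constructor but can be
// swapped out (e.g. stubbed outside a browser).
function playTogether(urls, audioFactory = (url) => new Audio(url)) {
  const tracks = urls.map(audioFactory);
  tracks.forEach((track) => track.play());
  return tracks; // keep references so the tracks can be paused later
}

// In a browser:
// playTogether(['sound1.mp3', 'sound2.mp3']);
```

Keeping the returned array around also answers the "dirty" concern: the instances live in one place instead of loose variables.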

SoundCloud Api redirect confusion and Audio Api Streams

血红的双手。 submitted on 2019-12-19 11:15:15

Question: I am attempting to make a request to the SoundCloud API. When I get the response, I set the stream_url as the source of an <audio> element. This works: http://matthiasdv.org/beta/ But not always... When you search for 'Bonobo', for example, you can play the first few tracks without any issue. But when you try to play 'London Grammar - Hey Now (Bonobo remix)' - the 7th result - it won't play, and it throws no errors whatsoever. I've been tinkering around with Chrome's dev tools and under…
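Two things are worth checking with this symptom: the SoundCloud API's stream_url is not playable as-is (the usual approach is to append your app's client_id as a query parameter), and not every track object is actually streamable (the track metadata carries a streamable flag). A sketch of both checks; the helper names and the CLIENT_ID placeholder are made up:

```javascript
// Hypothetical helper: make a SoundCloud stream_url playable by
// appending the app's client_id as a query parameter.
function toPlayableUrl(streamUrl, clientId) {
  const sep = streamUrl.includes('?') ? '&' : '?';
  return streamUrl + sep + 'client_id=' + encodeURIComponent(clientId);
}

// Hypothetical helper: drop tracks the API marks as non-streamable,
// which play silently (no error) if you try to use them anyway.
function playableTracks(tracks) {
  return tracks.filter((t) => t.streamable && t.stream_url);
}

// In a browser:
// audioEl.src = toPlayableUrl(track.stream_url, CLIENT_ID);
```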

WebAudio streaming with fetch : DOMException: Unable to decode audio data

佐手、 submitted on 2019-12-19 10:03:30

Question: I'm trying to play an infinite stream coming from the fetch API using Chrome 51 (a webcam audio stream as Microsoft PCM, 16 bit, mono, 11025 Hz). The code works almost OK with MP3 files, apart from some glitches, but it does not work at all with WAV files: for some reason I get "DOMException: Unable to decode audio data". The code is adapted from this answer: Choppy/inaudible playback with chunked audio through Web Audio API. Any idea if it's possible to make it work with WAV streams? function play…
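A plausible explanation for the "works for MP3, fails for WAV" split: decodeAudioData expects a complete, self-contained file, and when a WAV stream is read in chunks, only the first chunk carries the RIFF/WAVE header; later chunks are headerless PCM and fail to decode. MP3 is frame-based, so mid-stream chunks still start at decodable frame boundaries. A sketch of the header check, assuming the standard RIFF layout:

```javascript
// Check whether a chunk of bytes starts with a RIFF/WAVE header.
// Chunks sliced from the middle of a WAV stream fail this check, which
// is one reason decodeAudioData rejects them with a DOMException.
function looksLikeWav(bytes) {
  const ascii = (off, len) =>
    String.fromCharCode(...bytes.slice(off, off + len));
  return bytes.length >= 12 &&
    ascii(0, 4) === 'RIFF' &&
    ascii(8, 4) === 'WAVE';
}
```

For raw PCM like this, a workable alternative is to skip decodeAudioData entirely and copy the 16-bit samples into an AudioBuffer yourself.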

webaudio not working with ionic using crosswalk on android device

为君一笑 submitted on 2019-12-19 09:06:03

Question: I created an Ionic WebRTC app that runs perfectly when using ionic serve (in a web browser, which is normal), but it was not working at all on the device, since the getUserMedia function was not able to execute. The solution I found was to install Crosswalk, update the permissions in AndroidManifest.xml, and add a meta tag in index.html for content security: <meta http-equiv="Content-Security-Policy" content="media-src 'self' mediastream:"> Now I have a working Ionic WebRTC app, but only video; the…

How do I configure a bandpass filter?

拜拜、爱过 submitted on 2019-12-19 08:05:15

Question: I'm trying to use the Web Audio API's bandpass filter functionality, but I believe my question is more general. I don't understand the "Q" value of the bandpass filter. I would like to configure the filter to pass frequencies within Y hertz of a middle frequency of X hertz. I'm very new to audio programming, so are there other variables I need to consider to compute Q? Answer 1: Let's say you have a filter at 1000 Hz, and you want it to start at 500 Hz and end at 2000 Hz. First off,…
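The answer's setup (a filter centered at 1000 Hz passing 500-2000 Hz) can be turned into numbers directly: for a bandpass filter, Q is the center frequency divided by the bandwidth, with the center taken as the geometric mean of the band edges (frequencies behave logarithmically). A sketch, with the helper name made up:

```javascript
// For a bandpass filter: Q = fCenter / (fHigh - fLow), where fCenter is
// the geometric mean of the band edges.
function bandpassQ(fLow, fHigh) {
  const fCenter = Math.sqrt(fLow * fHigh);
  return { fCenter, Q: fCenter / (fHigh - fLow) };
}

// bandpassQ(500, 2000) → center 1000 Hz, Q = 1000 / 1500 ≈ 0.667
```

In a browser these two numbers map straight onto a BiquadFilterNode: filter.frequency.value = fCenter and filter.Q.value = Q.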

Offline / Non-Realtime Rendering with the Web Audio API

痞子三分冷 submitted on 2019-12-18 15:53:35

Question: The Problem: I'm working on a web application where users can sequence audio samples and optionally apply effects to the musical patterns they create using the Web Audio API. The patterns are stored as JSON data, and I'd like to do some analysis of the rendered audio of each pattern server-side. This leaves me with two options, as far as I can see: 1. Run my own rendering code server-side, trying to make it as faithful as possible to the in-browser rendering. Maybe I could even pull out the Web…
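The Web Audio API's own answer to non-realtime rendering is OfflineAudioContext: build the same node graph, call startRendering(), and receive the full mix as an AudioBuffer faster than realtime, which can then be uploaded for server-side analysis. Whichever route is taken, the JSON pattern's step indices must become sample-accurate start times first. A sketch of that conversion, assuming a made-up pattern shape with 16th-note steps (4 per beat):

```javascript
// Convert a step-sequencer pattern to start times in seconds.
// Assumed pattern shape (not from the question):
//   { bpm: 120, steps: [{ sample: 'kick', step: 0 }, ...] }
function scheduleTimes(pattern) {
  const secondsPerStep = 60 / pattern.bpm / 4; // 16th notes
  return pattern.steps.map((s) => ({
    sample: s.sample,
    when: s.step * secondsPerStep,
  }));
}

// In a browser, each `when` feeds bufferSource.start(when) on an
// OfflineAudioContext; startRendering() then yields the rendered mix
// without ever playing it aloud.
```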

Specifying codecs with MediaRecorder

浪子不回头ぞ submitted on 2019-12-18 04:18:50

Question: How can I specify the codecs used with the MediaRecorder API? The only option I see is mimeType, which isn't really sufficient, and cramming the codecs into the mimeType option doesn't seem to work.

var mediaRecorder = new MediaRecorder( outputMediaStream, { mimeType: 'video/webm; codecs="opus,vp8"' } );

This results in a WebM stream with Vorbis and VP8: FFMPEG STDERR: Input #0, matroska,webm, from 'pipe:': Metadata: encoder : QTmuxingAppLibWebM-0.0.1 Duration: N/A, start: 0.000000, bitrate…
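Later MediaRecorder implementations do honor codecs inside the mimeType string, and the API exposes MediaRecorder.isTypeSupported to probe what the browser will actually record. A sketch of picking the first supported type, with the probe injectable so the selection logic can be tested outside a browser (the helper name is made up):

```javascript
// Pick the first MIME type the recorder claims to support; return ''
// if none match. `isSupported` defaults to the real MediaRecorder probe
// but can be injected (e.g. stubbed outside a browser).
function pickMimeType(candidates,
                      isSupported = (t) => MediaRecorder.isTypeSupported(t)) {
  return candidates.find(isSupported) || '';
}

// In a browser:
// const mimeType = pickMimeType([
//   'video/webm; codecs="vp8, opus"',
//   'video/webm',
// ]);
// const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
```

Falling back to an empty options object lets the browser choose its default codecs instead of throwing on an unsupported type.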

Web audio API: scheduling sounds and exporting the mix

China☆狼群 submitted on 2019-12-18 02:49:38

Question: I've been checking the Web Audio API documentation and the tutorials but haven't quite figured out how to approach this problem. Let's say I load a few WAV files via XMLHttpRequest and then create buffer sources. I know I can schedule precisely when playback starts. But what if I don't want to play them, and instead want to store and schedule them in a buffer? A real example: I want to create a simple sequencer where you schedule drums and then export the whole mix to WAV (without recording it…
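Rendering the scheduled graph into an OfflineAudioContext produces the mix as an AudioBuffer without playing it; exporting that to WAV is then just wrapping the PCM samples in a 44-byte RIFF header. A mono 16-bit sketch (the function name is made up; a real sequencer would likely want stereo and interleaving):

```javascript
// Encode mono Float32 samples (range [-1, 1]) as a 16-bit PCM WAV file.
function encodeWav(samples, sampleRate) {
  const buffer = new ArrayBuffer(44 + samples.length * 2);
  const view = new DataView(buffer);
  const writeAscii = (off, s) => {
    for (let i = 0; i < s.length; i++) view.setUint8(off + i, s.charCodeAt(i));
  };
  writeAscii(0, 'RIFF');
  view.setUint32(4, 36 + samples.length * 2, true); // file size minus 8
  writeAscii(8, 'WAVE');
  writeAscii(12, 'fmt ');
  view.setUint32(16, 16, true);             // fmt chunk size
  view.setUint16(20, 1, true);              // format: PCM
  view.setUint16(22, 1, true);              // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * 2, true); // byte rate
  view.setUint16(32, 2, true);              // block align
  view.setUint16(34, 16, true);             // bits per sample
  writeAscii(36, 'data');
  view.setUint32(40, samples.length * 2, true);
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i])); // clamp
    view.setInt16(44 + i * 2, s < 0 ? s * 0x8000 : s * 0x7fff, true);
  }
  return new Uint8Array(buffer);
}
```

In a browser, the input would come from renderedBuffer.getChannelData(0) after startRendering() resolves, and the result can be handed to the user as a Blob download.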

Firefox 25 and AudioContext createJavaScriptNode not a function

主宰稳场 submitted on 2019-12-17 21:35:13

Question: Firefox 25 is supposed to bring Web Audio support, but an important function seems to be missing: createJavaScriptNode. I'm trying to build an analyser, but I get an error in the console that createJavaScriptNode is not a function. Demo - http://jsbin.com/olugOri/3/edit Answer 1: You can try using createScriptProcessor instead. Firefox is still not getting correct values, but at least that error is no longer present. Demo - http://jsbin.com/olugOri/4/edit Edit: (more visibility for the important discussion in the…
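The answer's fix (createJavaScriptNode was the early name; the spec renamed it createScriptProcessor) can be sketched as a small compatibility shim, assuming the standard (bufferSize, inputChannels, outputChannels) signature; the helper name is made up:

```javascript
// Use the standardized method name when available, falling back to the
// legacy name on older browsers.
function createProcessorNode(ctx, bufferSize = 4096, inCh = 1, outCh = 1) {
  const factory = ctx.createScriptProcessor || ctx.createJavaScriptNode;
  if (!factory) throw new Error('No script-processor support in this context');
  return factory.call(ctx, bufferSize, inCh, outCh);
}
```

The shim works in both directions, so code written against the old Chrome name keeps running on Firefox 25 and vice versa.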