web-audio-api

How can I capture the AudioDestinationNode value when headphones are plugged in with Web Audio API?

Submitted by Deadly on 2019-12-23 02:21:22
Question: I've been looking for a solution that detects the difference between the default speakers and headphones on a computer. I understand that in the Web Audio API, the AudioDestinationNode represents the output device, where users hear audio. My question, to be specific, is whether or not it is possible to detect a change in the user's audio output device (wired/wireless headphones). If this is not possible, is there a way to use PhoneGap to do so, for computers as well as mobile devices? My goal is to …
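A hedged sketch of one possible approach: the AudioDestinationNode itself does not expose device identity, but the Media Devices API can list output devices and fires an event when one is plugged in or removed. The helper and callback names below are illustrative assumptions, and note that enumerateDevices may return unlabeled entries until the user grants media permission.

```javascript
// Sketch: watch for output-device changes via the Media Devices API
// (the AudioDestinationNode does not identify the physical device).

function filterAudioOutputs(devices) {
  // Keep only output devices (speakers, headphones) from enumerateDevices()
  return devices.filter(function (d) { return d.kind === 'audiooutput'; });
}

function watchOutputDevices(onChange) {
  // Fires whenever an audio device is plugged in or removed (browser only)
  navigator.mediaDevices.addEventListener('devicechange', function () {
    navigator.mediaDevices.enumerateDevices().then(function (devices) {
      onChange(filterAudioOutputs(devices));
    });
  });
}
```

Whether a new `audiooutput` entry is headphones still has to be inferred from its label, which is a heuristic rather than a guarantee.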

Using Web Audio API decodeAudioData with external binary data

Submitted by 痴心易碎 on 2019-12-22 17:50:04
Question: I've searched related questions but wasn't able to find any relevant info. I'm trying to get the Web Audio API to play an MP3 file that is embedded in another file container, so what I'm doing so far is parsing said container and feeding the resulting binary data (ArrayBuffer) to the audioContext.decodeAudioData method, which supposedly accepts any kind of ArrayBuffer containing audio data. However, it always triggers the error callback. I only have a faint grasp of what I'm doing, so probably the …
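One common cause of the error callback is worth illustrating: decodeAudioData expects a complete, standalone audio file, header included, so the bytes must be copied out of the container starting exactly at the embedded file's first byte. `extractPayload` is a hypothetical helper name; the actual offsets depend on the container format being parsed.

```javascript
// Sketch: hand decodeAudioData an independent copy of the embedded file,
// not a view into the middle of the container.

function extractPayload(containerBuffer, offset, length) {
  // slice() copies the bytes into a fresh ArrayBuffer that begins
  // at the embedded audio file's first header byte.
  return containerBuffer.slice(offset, offset + length);
}

// Browser usage (assumed container layout):
// const ctx = new (window.AudioContext || window.webkitAudioContext)();
// ctx.decodeAudioData(extractPayload(raw, mp3Offset, mp3Length),
//   function (buffer) { console.log('decoded', buffer.duration); },
//   function (err) { console.error('decode failed', err); });
```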

iOS6/7 stop sound going to background using web audio API

Submitted by 限于喜欢 on 2019-12-22 13:33:02
Question: There are various solutions to the problem of sound continuing to play when you send the page to the background on an iPhone or iPad. Most of them are for the HTML5 audio tag, but they are not relevant if you are using the Web Audio API, because there is no event like "timeupdate" and it is, of course, a different concept. The Page Visibility API works in iOS 7 only if you change tabs, but not if you go to the background, and in iOS 6 not at all. Does anyone know any way to stop/mute a sound using Web Audio …
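A sketch of one workaround, not a confirmed fix for iOS 6/7: route every source through a master GainNode and zero its gain on whichever visibility signal the platform does deliver (visibilitychange, with pagehide/pageshow as an assumed fallback). Only the toggle below is concrete; the event wiring in the comments is untested on those iOS versions.

```javascript
// Sketch: a single mute point for the whole Web Audio graph.

function setMuted(masterGain, muted) {
  // Pure toggle so any visibility event can drive it
  masterGain.gain.value = muted ? 0 : 1;
  return masterGain.gain.value;
}

// Browser wiring (assumed):
// const ctx = new (window.AudioContext || window.webkitAudioContext)();
// const master = ctx.createGain();
// master.connect(ctx.destination);          // all sources connect to master
// document.addEventListener('visibilitychange', function () {
//   setMuted(master, document.hidden);
// });
// window.addEventListener('pagehide', function () { setMuted(master, true); });
// window.addEventListener('pageshow', function () { setMuted(master, false); });
```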

Audio Level Meter for Web RTC Stream

Submitted by 烂漫一生 on 2019-12-22 09:48:30
Question: I would like to create a decibel meter for the audio that is playing in a video element. The video element is playing a WebRTC stream. At the moment, WebRTC streams cannot be passed into a Web Audio analyser. (Although this might change soon …) (see Web Audio API analyser node getByteFrequencyData returning blank array) Is there currently another way to get decibel information from a remote MediaStream?

Answer 1: Chrome 50 has been released: as of the 13th of April 2016, using an AnalyserNode with a …
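Building on the answer: once an AnalyserNode can read the remote stream (Chrome 50+ per the note above), a rough level in dBFS can be computed from time-domain samples. `rmsDb` is plain math; the wiring in the comments is an assumed setup.

```javascript
// Sketch: RMS level of a frame of samples, expressed in dBFS.

function rmsDb(samples) {
  // samples: Float32Array of time-domain values in [-1, 1]
  let sum = 0;
  for (let i = 0; i < samples.length; i++) sum += samples[i] * samples[i];
  const rms = Math.sqrt(sum / samples.length);
  return 20 * Math.log10(rms || 1e-10); // clamp to avoid -Infinity on silence
}

// Browser wiring (assumed):
// const src = audioCtx.createMediaStreamSource(remoteStream);
// const analyser = audioCtx.createAnalyser();
// src.connect(analyser);
// const buf = new Float32Array(analyser.fftSize);
// analyser.getFloatTimeDomainData(buf);
// console.log(rmsDb(buf).toFixed(1), 'dBFS');
```

A full-scale sine reads about -3 dBFS with this formula; 0 dBFS corresponds to a constant full-scale signal.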

How to release memory using Web Audio API?

Submitted by 本小妞迷上赌 on 2019-12-22 09:06:03
Question:

var context = new window.AudioContext();
var request = cc.loader.getXMLHttpRequest();
request.open("GET", 'res/raw-assets/resources/audio/bgm.mp3', true);
request.responseType = "arraybuffer";
request.onload = function () {
    context["decodeAudioData"](request.response, function (buffer) {
        // success
        cc.log('success');
        window.buffer = buffer;
        playBgm();
    }, function () {
        // error
    });
};
request.onerror = function () {
    // error
};
request.send();

function playBgm() {
    var audio = context["createBufferSource"]() …
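A sketch of the usual release pattern, under the assumption that the memory is held by the retained AudioBuffer (`window.buffer` above) and by source nodes that are never disconnected: drop the buffer reference, disconnect each source when it ends, and close the context when done. `makeSourceTracker` is a hypothetical helper, not part of any API.

```javascript
// Sketch: let the GC reclaim decoded audio by severing every reference.

function makeSourceTracker() {
  const live = new Set();
  return {
    track(source) {
      live.add(source);
      source.onended = () => {   // disconnect once playback finishes
        source.disconnect();
        live.delete(source);
      };
      return source;
    },
    count() { return live.size; }
  };
}

// Browser usage (assumed):
// const tracker = makeSourceTracker();
// const src = tracker.track(context.createBufferSource());
// src.buffer = window.buffer;
// src.connect(context.destination);
// src.start();
// // when the buffer is no longer needed:
// window.buffer = null;        // drop the last reference to the AudioBuffer
// context.close();             // release the context's audio resources
```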

How to disable Web-Audio analyzer filtering high frequencies

Submitted by 时间秒杀一切 on 2019-12-21 20:36:48
Question: I am studying the HTML5 audio API. I have noticed that the analyser module has problems processing high frequencies, as if there were a built-in filter. For example, if I emit a 20 kHz tone and plot the outcome of getFloatFrequencyData, I see the following spectrum: However, if I use Audacity, the same signal looks like this: (notice the peak at 20 kHz) Can I disable the built-in filter of the analyser module? P.S. The sampling rate is high enough according to the context, so I …
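There is no documented switch to remove the AnalyserNode's internal Blackman windowing, but two things are worth ruling out before concluding a filter exists: exponential smoothing (smoothingTimeConstant defaults to 0.8, which averages frames over time) and bin resolution around 20 kHz. `binFrequency` is plain arithmetic; the property settings in the comments assume a standard AudioContext.

```javascript
// Sketch: locate the FFT bin that should hold a 20 kHz tone.

function binFrequency(binIndex, fftSize, sampleRate) {
  // Center frequency of FFT bin i is i * sampleRate / fftSize
  return binIndex * sampleRate / fftSize;
}

// Browser usage (assumed):
// analyser.smoothingTimeConstant = 0;  // disable exponential averaging
// analyser.fftSize = 8192;             // narrower bins near 20 kHz
// const bin = Math.round(20000 * analyser.fftSize / audioCtx.sampleRate);
// analyser.getFloatFrequencyData(data);
// console.log('20 kHz bin level:', data[bin]);
```

At fftSize 2048 and 44.1 kHz, bins are about 21.5 Hz wide, so a pure 20 kHz tone sits close to Nyquist and any windowing loss is concentrated in very few bins.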

Preloading the next song in a playlist a bit before the current one ends

Submitted by 蓝咒 on 2019-12-21 19:55:34
Question: I've made a small media player that works fine, but I want to make it so that there's no more loading between songs. I know about the preload property, but it only preloads the music when the page loads for the first time, so I feel like this won't work. Is there a way to do this at all? Maybe using the Web Audio API?

Answer 1: When you start playing a song, you could watch the play event of the audio element and already start preloading the next song in the queue. This is the function I use for …
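The answer's idea can be sketched as follows; the playlist array, player element, and prefetched variable are assumptions, and only `nextIndex` is concrete.

```javascript
// Sketch: fetch the next track while the current one is still playing.

function nextIndex(current, playlistLength) {
  // Wrap around at the end of the playlist
  return (current + 1) % playlistLength;
}

// Browser usage (assumed):
// let i = 0, prefetched = null;
// player.addEventListener('play', function () {
//   prefetched = new Audio(playlist[nextIndex(i, playlist.length)]);
//   prefetched.preload = 'auto';
//   prefetched.load();                 // start buffering immediately
// });
// player.addEventListener('ended', function () {
//   i = nextIndex(i, playlist.length);
//   player.src = prefetched.src;       // already cached, so the swap is fast
//   player.play();
// });
```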

AudioContext.createMediaStreamSource alternative for iOS?

Submitted by 旧巷老猫 on 2019-12-21 16:58:40
Question: I've developed an app using Cordova and the Web Audio API that allows the user to plug in headphones, press the phone against their heart, and hear their own heartbeat. It does this by using audio filter nodes.

// Set up userMedia
context = new (window.AudioContext || window.webkitAudioContext)();
navigator.getUserMedia = (navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia ||
    navigator.msGetUserMedia);
navigator.getUserMedia({ audio: true }, userMediaSuccess, …
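A hedged feature-detection sketch: createMediaStreamSource (and getUserMedia itself) was long unavailable in iOS Safari and the Cordova web view, so the practical alternative is to detect the gap and fall back to a native microphone plugin. `liveInputSupport` is a hypothetical helper that takes the globals as parameters so it can be tested outside a browser.

```javascript
// Sketch: report which pieces of the live-audio path exist here.

function liveInputSupport(nav, AudioCtx) {
  return {
    getUserMedia: !!(nav.mediaDevices && nav.mediaDevices.getUserMedia),
    streamSource: !!(AudioCtx && AudioCtx.prototype &&
                     'createMediaStreamSource' in AudioCtx.prototype)
  };
}

// Browser usage (assumed):
// const support = liveInputSupport(navigator,
//   window.AudioContext || window.webkitAudioContext);
// if (!support.streamSource) {
//   // fall back to a native Cordova audio-capture plugin
// }
```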

Animate object by sound amplitude?

Submitted by 本小妞迷上赌 on 2019-12-21 15:46:09
Question: I know it's possible to animate an object with sound via ActionScript. I'm really hoping it is also possible to animate an object with JavaScript, since they are very similar. Maybe it could be done with jQuery or HTML5; I'm just hoping to find a way to do it outside of Flash. Does anyone know if this is possible in any of these formats? I have done lots of research and can't seem to find any forums or tutorials that say whether it is possible. BASICALLY, I am trying to achieve this same …
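This is possible without Flash: the Web Audio AnalyserNode exposes the waveform on every animation frame, and its average deviation from silence can drive any CSS or canvas property. `averageLevel` is plain math over getByteTimeDomainData output (bytes centered on 128); the animation wiring in the comments is an assumed setup.

```javascript
// Sketch: reduce a frame of waveform bytes to a 0..1 amplitude.

function averageLevel(bytes) {
  // Mean absolute deviation from the 128 midpoint, normalized to 0..1
  let sum = 0;
  for (let i = 0; i < bytes.length; i++) sum += Math.abs(bytes[i] - 128);
  return sum / (bytes.length * 128);
}

// Browser wiring (assumed):
// const analyser = audioCtx.createAnalyser();
// sourceNode.connect(analyser);
// const data = new Uint8Array(analyser.fftSize);
// (function frame() {
//   analyser.getByteTimeDomainData(data);
//   box.style.transform = 'scale(' + (1 + averageLevel(data)) + ')';
//   requestAnimationFrame(frame);
// })();
```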