web-audio-api

Id tag causes audio to not play in chrome

Posted by 我的未来我决定 on 2019-12-13 13:38:26
Question: I have an audio element on a webpage and I need to add an id tag "player" to the element. I am using getElementById("player"). The element in question: <audio id="player" controls loop> <source src="Out_of_the_Skies_Under_the_Earth.mp3" type="audio/mp3"> </audio> I am using the 'player' tag to make the next two lines useful, using the Web Audio API: var mediaElement = document.getElementById('player'); var sourceNode = context.createMediaElementSource(mediaElement); This is the only place I
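A common cause of this symptom (audio goes silent in Chrome after these two lines) is that createMediaElementSource() reroutes the element's output into the audio graph, so it must be reconnected to the destination. The following is a minimal sketch of that setup, not the asker's full code; the click handler for resuming the context is an assumption based on Chrome's autoplay policy.

```javascript
// Once createMediaElementSource() is called, the <audio> element no longer
// outputs directly to the speakers; its audio flows through the graph, so it
// must be connected back to context.destination or nothing will be heard.
const context = new AudioContext();
const mediaElement = document.getElementById('player');
const sourceNode = context.createMediaElementSource(mediaElement);
sourceNode.connect(context.destination);

// Chrome suspends AudioContexts created without a user gesture; resuming
// from a click handler (illustrative) avoids a silently suspended context.
document.addEventListener('click', () => context.resume(), { once: true });
```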

Is it possible to play audio from a local ios library via html5?

Posted by 匆匆过客 on 2019-12-13 05:09:23
Question: Say we have an iPhone that has mp3s synced to it that play normally with the built-in iOS music player. Can a webpage be coded so that if I surf to it using mobile Safari, I can see my music library and play the files? If so, what APIs should I use? Answer 1: On iOS, this is not possible. It is possible for Chrome Applications to get access to the local media library; however, iOS (and even Chrome on Android right now, actually) does not support this. Answer 2: As browser limitations and security

How to change playback speed of live audio from microphone (using a buffer)?

Posted by 蓝咒 on 2019-12-13 04:54:20
Question: I have heard that by changing playback speed we can modify the frequency of the audio. I have tested it here: https://teropa.info/blog/2016/08/10/frequency-and-pitch.html But the problem is that I need a recorded audio file to do that. From what I have found, the Web Audio API can't change the playback speed of live audio. I have been thinking that if we save the audio in a buffer, we can change its playback speed and thus the frequency. I am new to the Web Audio API. I have found an article that records
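The buffering idea in the question can be sketched as follows: record a short chunk of microphone input, decode it, and replay it with a different AudioBufferSourceNode.playbackRate (which shifts pitch along with speed). This is a hedged sketch, not a known answer from the thread; the chunk length and rate are illustrative.

```javascript
// Record ~2 s of mic input, then replay it at half speed (lower pitch).
const context = new AudioContext();
const chunkMs = 2000; // illustrative chunk length

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = async () => {
    const blob = new Blob(chunks);
    const buffer = await context.decodeAudioData(await blob.arrayBuffer());
    const source = context.createBufferSource();
    source.buffer = buffer;
    source.playbackRate.value = 0.5; // half speed => frequency halved
    source.connect(context.destination);
    source.start();
  };
  recorder.start();
  setTimeout(() => recorder.stop(), chunkMs);
});
```

Note this trades latency for control: the audio is no longer "live" but delayed by the chunk length.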

WebAudio changing of orientation of Listener and/or Panner

Posted by 隐身守侯 on 2019-12-13 03:53:32
Question: I am trying to understand how the WebAudio API would work. I have two objects: one representing the listener and one the source. I have used the link below as an example, and I am able to move the source so that the sound position changes. https://mdn.github.io/webaudio-examples/panner-node/ The command to change the orientation is provided: viz. this.panner.setOrientation or this.listener.setOrientation . The question I have is: if I have a source or listener object (in canvas mode using
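One way to bridge a 2D canvas object and the panner's orientation is to turn the object's rotation angle into a unit forward vector. This is an illustrative sketch under assumed conventions (angle in radians in the canvas plane, mapped to x/z in audio space with y up), not the asker's code.

```javascript
// Derive a forward vector from a 2D rotation angle and apply it to a panner.
const context = new AudioContext();
const panner = context.createPanner();

function setPannerFacing(angle) {
  const x = Math.cos(angle);
  const z = Math.sin(angle);
  // Newer browsers expose orientation as AudioParams; setOrientation()
  // is the older method shown in the MDN example.
  if (panner.orientationX) {
    panner.orientationX.value = x;
    panner.orientationY.value = 0;
    panner.orientationZ.value = z;
  } else {
    panner.setOrientation(x, 0, z);
  }
}
```

The listener's orientation works the same way but also needs an "up" vector (0, 1, 0) alongside the forward vector.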

Cannot analyse Soundcloud's streaming audio because of the lack of CORS policy

Posted by 萝らか妹 on 2019-12-13 02:38:53
Question: I am working on this visualizer http://trif.it that still works well on Chrome Stable (41.x) but stopped working in Chrome Dev, Beta and Canary (42.x onwards) because of a change in how Chrome (and Firefox before it) handles audio sources to be analysed. Here is the code that is problematic. It should work well until you remove the comments on the last portion that handles the audio routing. var audioElement = new Audio(); var clientId = "xxxxx"; var oReq = new XMLHttpRequest(); //oReq
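The Chrome 42+ change referred to here is that analysing a cross-origin MediaElementSource now requires CORS opt-in: without it, the analyser receives silence. A minimal sketch of the CORS-enabled routing (the stream URL shape is illustrative, and assumes the server sends Access-Control-Allow-Origin headers):

```javascript
// Opt the element into CORS so the analyser can read the decoded samples.
const context = new AudioContext();
const audioElement = new Audio();
audioElement.crossOrigin = 'anonymous'; // the key addition
audioElement.src = 'https://api.soundcloud.com/tracks/TRACK_ID/stream?client_id=xxxxx';

const source = context.createMediaElementSource(audioElement);
const analyser = context.createAnalyser();
source.connect(analyser);
analyser.connect(context.destination);
audioElement.play();
```

If the audio host does not send CORS headers at all, no client-side change helps; the fallback is fetching the file with XHR/fetch and decoding it into a buffer.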

Android Chrome 39 getByteFrequencyData returns 0 array

Posted by 我是研究僧i on 2019-12-13 02:18:26
Question: I have set up an example here - http://jsbin.com/hotovu/2/ On desktop Chrome all is good. Android Chrome 39 returns arrays of all zeros (0, 0, 0, ...) (checked via the ADB plugin for Chrome debugging). Any workaround to make this work? Answer 1: This is a known bug: http://crbug.com/419446. It's not the analyser, it's the media element. If you can load the audio with XHR into a buffer and play it, it will work. Source: https://stackoverflow.com/questions/27552704/android-chrome-39-getbytefrequencydata-returns-0-array
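The workaround from the answer can be sketched like this: fetch the file as an ArrayBuffer, decode it, and play it through an AudioBufferSourceNode so the analyser sees real samples instead of the broken media-element path. The file name is a placeholder.

```javascript
// Buffer-based playback: bypasses the media element entirely.
const context = new AudioContext();
const analyser = context.createAnalyser();

fetch('track.mp3') // placeholder URL
  .then((res) => res.arrayBuffer())
  .then((data) => context.decodeAudioData(data))
  .then((buffer) => {
    const source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(analyser);
    analyser.connect(context.destination);
    source.start();
    // Sample the spectrum each frame while the buffer plays.
    const bins = new Uint8Array(analyser.frequencyBinCount);
    (function poll() {
      analyser.getByteFrequencyData(bins);
      requestAnimationFrame(poll);
    })();
  });
```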

How to create a live media stream with Javascript

Posted by 被刻印的时光 ゝ on 2019-12-13 00:19:48
Question: I want to create a live audio stream from one device to a Node server, which can then broadcast that live feed to several front ends. I have searched extensively for this and have really hit a wall, so I'm hoping somebody out there can help. I am able to get my audio input from the window.navigator.getUserMedia API. getAudioInput(){ const constraints = { video: false, audio: {deviceId: this.state.deviceId ? {exact: this.state.deviceId} : undefined}, }; window.navigator.getUserMedia(
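One common shape for this pipeline is to swap the deprecated callback-style window.navigator.getUserMedia for the promise-based navigator.mediaDevices.getUserMedia, then ship MediaRecorder chunks to the server over a WebSocket. This is a hedged sketch: `myDeviceId` and `socket` are assumed to exist in the surrounding app (they stand in for the question's this.state.deviceId and some server connection).

```javascript
// Capture mic audio and stream timed chunks to a server over a WebSocket.
const constraints = {
  video: false,
  audio: { deviceId: myDeviceId ? { exact: myDeviceId } : undefined },
};

navigator.mediaDevices.getUserMedia(constraints)
  .then((stream) => {
    const recorder = new MediaRecorder(stream);
    // `socket` is an assumed, already-open WebSocket to the Node server.
    recorder.ondataavailable = (e) => socket.send(e.data);
    recorder.start(250); // emit a chunk every 250 ms
  })
  .catch((err) => console.error('getUserMedia failed:', err));
```

The server side would then rebroadcast those chunks to each connected front end, which can reassemble and play them.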

AudioWorklet DOMException error when loading modules

Posted by 笑着哭i on 2019-12-12 20:22:23
Question: I'm working on a WebAudio application that requires AudioWorklets and needs functions from many different scripts to be used in the process() function. Therefore, I'm trying to load those scripts in the processor script ( frictionProcessor.js ) with the import command as shown: import {MAX_ERROR, MAX_ITERATIONS} from "./utilities.js"; class FrictionProcessor extends AudioWorkletProcessor {...} registerProcessor('frictionProcessor', FrictionProcessor); where utilities.js is: //Constants const
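For context, worklet processor files are fetched as ES modules via audioWorklet.addModule(), so static imports inside them are legal, but a wrong path, a non-JavaScript MIME type from the server, or a missing export on utilities.js all surface as a DOMException at load time. A sketch of the main-thread loading side, using the names from the question:

```javascript
// Load the processor module, then instantiate a node bound to the name
// passed to registerProcessor() inside frictionProcessor.js.
const context = new AudioContext();

context.audioWorklet.addModule('frictionProcessor.js')
  .then(() => {
    const node = new AudioWorkletNode(context, 'frictionProcessor');
    node.connect(context.destination);
  })
  .catch((err) => {
    // DOMException here usually means a bad path, wrong MIME type,
    // or a syntax/import error inside the module graph.
    console.error('addModule failed:', err);
  });
```

Also note that utilities.js must use export statements (e.g. export const MAX_ERROR = ...) for the named import to resolve.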

ChannelMergerNode in Web audio API not merging channels

Posted by 元气小坏坏 on 2019-12-12 16:19:41
Question: I'm trying to use the Web Audio API to create an audio stream whose left and right channels are generated by different oscillators. The output of the left channel is correct, but the right channel is 0. Based on the spec, I can't see what I'm doing wrong. Tested in Chrome Dev. Code: var context = new AudioContext(); var l_osc = context.createOscillator(); l_osc.type = "sine"; l_osc.frequency.value = 100; var r_osc = context.createOscillator(); r_osc.type = "sawtooth"; r_osc.frequency.value =
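Since the excerpt is cut off, here is a hedged completion of the described setup. The detail that most often produces exactly this symptom is connect()'s third argument: connect(destination, outputIndex, inputIndex), where the merger's input index selects the output channel. Omitting it routes both oscillators into input 0, leaving the right channel silent.

```javascript
// Route each oscillator to a distinct ChannelMergerNode input so the
// merged output carries left and right channels independently.
const context = new AudioContext();

const l_osc = context.createOscillator();
l_osc.type = 'sine';
l_osc.frequency.value = 100;

const r_osc = context.createOscillator();
r_osc.type = 'sawtooth';
r_osc.frequency.value = 100;

const merger = context.createChannelMerger(2);
l_osc.connect(merger, 0, 0); // oscillator output 0 -> merger input 0 (left)
r_osc.connect(merger, 0, 1); // oscillator output 0 -> merger input 1 (right)
merger.connect(context.destination);

l_osc.start();
r_osc.start();
```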

web audio api: multiply waves

Posted by 混江龙づ霸主 on 2019-12-12 13:46:55
Question: The Web Audio API lets me create a constant sine wave at a specified frequency like this: var actx = new AudioContext(); var osc = actx.createOscillator(); osc.frequency.value = 500; osc.connect(actx.destination); osc.start(); How can I multiply this wave by another wave in order to "shape" it? For example, how could I multiply it by another sine wave of 200 Hz? Answer 1: Try something like var osc1 = context.createOscillator(); var osc2 = context.createOscillator(); var gain =
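The truncated answer is describing ring modulation via a GainNode: route one oscillator through the gain node and connect the other oscillator to the gain.gain AudioParam, which multiplies the two waveforms sample by sample. A hedged sketch of that pattern (the base gain of 0 is the key detail, since the modulator's output is added to it):

```javascript
// Ring modulation: output = carrier(t) * modulator(t).
const context = new AudioContext();

const osc1 = context.createOscillator(); // carrier
osc1.frequency.value = 500;
const osc2 = context.createOscillator(); // modulator
osc2.frequency.value = 200;

const gain = context.createGain();
gain.gain.value = 0; // modulator output is summed onto this base value

osc1.connect(gain);          // carrier passes through the gain node
osc2.connect(gain.gain);     // modulator drives the gain at audio rate
gain.connect(context.destination);

osc1.start();
osc2.start();
```

The result contains the sum and difference frequencies (700 Hz and 300 Hz for these values), which is the classic ring-modulation spectrum.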