avaudioengine

Build a simple Equalizer

旧巷老猫 submitted on 2020-06-25 00:57:26
Question: I would like to make a 5-band audio equalizer (60Hz, 230Hz, 910Hz, 4kHz, 14kHz) using AVAudioEngine. I would like to have the user set the gain per band through a vertical slider and adjust the playing audio accordingly. I tried using AVAudioUnitEQ to do this, but I hear no difference when playing the audio. I also tried hardcoding values to specify a gain at each frequency, but it still does not work. Here is the code I have: var audioEngine: AVAudioEngine = AVAudioEngine() var
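A minimal sketch of a 5-band AVAudioUnitEQ chain, using the band frequencies from the question; the node names, bandwidth, and gain values are illustrative assumptions, not taken from the original post:

```swift
import AVFoundation

// Sketch: player -> 5-band EQ -> main mixer. Values are illustrative.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 5)

let frequencies: [Float] = [60, 230, 910, 4000, 14000]
for (i, band) in eq.bands.enumerated() {
    band.filterType = .parametric   // peaking filter with adjustable gain
    band.frequency = frequencies[i]
    band.bandwidth = 0.5            // in octaves; tune to taste
    band.gain = 0                   // set from the slider, range -96...24 dB
    band.bypass = false             // bands are bypassed by default!
}

engine.attach(player)
engine.attach(eq)
engine.connect(player, to: eq, format: nil)
engine.connect(eq, to: engine.mainMixerNode, format: nil)
```

A frequent cause of "I hear no difference" is leaving each band's `bypass` at its default of `true`, or leaving `filterType` at a type whose `gain` has no effect.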

Can an audio unit (v3) replace inter-app audio to send audio to a host app?

浪子不回头ぞ submitted on 2020-06-17 07:36:50
Question: My music performance app plays audio with AVAudioEngine, and uses Inter-App Audio to publish the engine's output to other apps. This allows users to feed the audio into a mixer app running on the same device. Since IAA is deprecated on iOS and not supported on Mac, I'm trying to replace this functionality with Audio Units. I've added an audio unit extension of type augn using the Xcode template, and I understand the internalRenderBlock is what actually returns the audio data. But how can the
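For context, a skeletal sketch of what an `augn` extension's `internalRenderBlock` looks like; the class name is hypothetical, and the hard part the question is asking about (moving audio from the container app's engine into the extension process) is only indicated by a comment, since extensions run out-of-process and need shared memory or a ring buffer:

```swift
import AudioToolbox
import AVFoundation

// Hypothetical generator audio unit. A real implementation would read frames
// that the container app wrote into some shared transport (e.g. a ring buffer).
class EngineTapUnit: AUAudioUnit {
    override var internalRenderBlock: AUInternalRenderBlock {
        return { actionFlags, timestamp, frameCount, outputBusNumber,
                 outputData, realtimeEventListHead, pullInputBlock in
            let abl = UnsafeMutableAudioBufferListPointer(outputData)
            for buffer in abl {
                // Placeholder: emit silence. Replace with a real-time-safe
                // read from the app-to-extension audio transport.
                memset(buffer.mData, 0, Int(buffer.mDataByteSize))
            }
            return noErr
        }
    }
}
```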

Does AUGraph deprecation mean no more audio render callbacks?

北城余情 submitted on 2020-06-12 08:17:30
Question: I have an app with an elaborate render callback that I doubt I could reproduce with AVAudioEngine. Is there any way to use my AUGraph render callback (with multiple buses) with AVAudioEngine? Any sample code? Answer 1: The Audio Unit API is not deprecated, only AUGraph, which is presumably built on top of it. Make connections using AudioUnitSetProperty with kAudioUnitProperty_MakeConnection and an AudioUnitConnection struct. Start and stop your output unit with AudioOutputUnitStart and AudioOutputUnitStop. Set
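The answer's connection step, sketched in Swift; `mixerUnit` and `outputUnit` are assumed to be already-created `AudioUnit` instances, and the bus numbers are illustrative:

```swift
import AudioToolbox

// Wire two Audio Units directly, with no AUGraph involved:
// source output bus 0 feeds the destination's input bus 0.
func connect(source mixerUnit: AudioUnit, to outputUnit: AudioUnit) -> OSStatus {
    var connection = AudioUnitConnection(
        sourceAudioUnit: mixerUnit,
        sourceOutputNumber: 0,   // bus on the source unit
        destInputNumber: 0)      // bus on the destination unit
    return AudioUnitSetProperty(outputUnit,
                                kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input,
                                0,  // element = destination input bus
                                &connection,
                                UInt32(MemoryLayout<AudioUnitConnection>.size))
}

// Then, as the answer says, drive the chain from the output unit:
//   AudioOutputUnitStart(outputUnit)
//   AudioOutputUnitStop(outputUnit)
```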

Using AVAudioEngine to schedule sounds for low-latency metronome

倾然丶 夕夏残阳落幕 submitted on 2020-06-09 17:44:58
Question: I am creating a metronome as part of a larger app and I have a few very short WAV files to use as the individual sounds. I would like to use AVAudioEngine because NSTimer has significant latency problems and Core Audio seems rather daunting to implement in Swift. I'm attempting the following, but I'm currently unable to implement the first 3 steps and I'm wondering if there is a better way. Code outline: Create an array of file URLs according to the metronome's current settings (number of
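A sketch of the usual low-latency approach: schedule each click at an explicit sample time on an AVAudioPlayerNode rather than firing a timer. The file name, tempo, and beat count are illustrative assumptions:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

// Assumed resource name; load the short click into a PCM buffer once.
let clickURL = Bundle.main.url(forResource: "click", withExtension: "wav")!
let file = try AVAudioFile(forReading: clickURL)
let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                              frameCapacity: AVAudioFrameCount(file.length))!
try file.read(into: buffer)

try engine.start()
player.play()

let bpm = 120.0
let sampleRate = file.processingFormat.sampleRate
let samplesPerBeat = AVAudioFramePosition(sampleRate * 60.0 / bpm)

// Sample-accurate: each beat lands exactly samplesPerBeat frames apart,
// independent of main-thread timer jitter.
for beat in 0..<16 {
    let when = AVAudioTime(sampleTime: AVAudioFramePosition(beat) * samplesPerBeat,
                           atRate: sampleRate)
    player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
}
```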

Error when installing a tap on audio engine input node

这一生的挚爱 submitted on 2020-05-13 19:20:40
Question: Whenever the code reaches inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) {[weak self] (buffer:AVAudioPCMBuffer, when:AVAudioTime), the app crashes with the following error: Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: format.sampleRate == hwFormat.sampleRate'. I tried removing taps before adding another, and I'm making sure I'm not adding more than one tap. What is weird is that the app is working fine
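A sketch of the usual fix for this exception: query the input node's format immediately before installing the tap, so it matches the current hardware sample rate (a format cached before an audio-route or session change is a common cause of the mismatch):

```swift
import AVFoundation

let engine = AVAudioEngine()
let inputNode = engine.inputNode

// Ask for the format *after* the session/route is configured; bus 0's output
// format reflects the hardware rate (e.g. 48 kHz vs. a stale 44.1 kHz format).
let recordingFormat = inputNode.outputFormat(forBus: 0)

inputNode.removeTap(onBus: 0)  // defensive: a bus can only hold one tap
inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) {
    (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
    // process the captured buffer here
}
```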

AVAudioSinkNode with non-default, but still device-native sample rates

喜夏-厌秋 submitted on 2020-04-17 07:17:09
Question: I've configured an AVAudioSinkNode attached to AVAudioEngine's inputNode like so: let sinkNode = AVAudioSinkNode() { (timestamp, frames, audioBufferList) -> OSStatus in print("SINK: \(timestamp.pointee.mHostTime) - \(frames) - \(audioBufferList.pointee.mNumberBuffers)") return noErr } audioEngine.attach(sinkNode) audioEngine.connect(audioEngine.inputNode, to: sinkNode, format: nil) audioEngine.prepare() do { try audioEngine.start() print("AudioEngine started.") } catch { print("AudioEngine did
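A completed version of the excerpt above, sketched with the format made explicit rather than `nil`, so the sink connection is pinned to whatever device-native sample rate the input node currently reports; this is an untested outline, not the asker's final code:

```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let sinkNode = AVAudioSinkNode { (timestamp, frames, audioBufferList) -> OSStatus in
    print("SINK: \(timestamp.pointee.mHostTime) - \(frames) - \(audioBufferList.pointee.mNumberBuffers)")
    return noErr
}
audioEngine.attach(sinkNode)

// Connect using the input node's current hardware format instead of nil.
let inputFormat = audioEngine.inputNode.outputFormat(forBus: 0)
audioEngine.connect(audioEngine.inputNode, to: sinkNode, format: inputFormat)
audioEngine.prepare()
do {
    try audioEngine.start()
    print("AudioEngine started at \(inputFormat.sampleRate) Hz.")
} catch {
    print("AudioEngine did not start: \(error)")
}
```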

How to select audio input device (mic) in AVAudioEngine on macOS / swift?

空扰寡人 submitted on 2020-01-25 08:03:45
Question: Is it possible to select the input device in AVAudioEngine using Swift on macOS? Use case: I am using SFSpeechRecognizer on macOS. To feed microphone data into it I am using private let audioEngine = AVAudioEngine(): let inputNode = audioEngine.inputNode let recordingFormat = inputNode.outputFormat(forBus: 0) inputNode.installTap( onBus: 0, bufferSize: 1024, format: recordingFormat ) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in self.recognitionRequest?.append( buffer ) } audioEngine
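One known macOS approach, sketched below: set the Core Audio device on the engine's underlying input audio unit via kAudioOutputUnitProperty_CurrentDevice. Obtaining `deviceID` (e.g. by enumerating kAudioHardwarePropertyDevices with AudioObjectGetPropertyData) is left out; this is an outline under that assumption, not a drop-in solution:

```swift
import AVFoundation
import CoreAudio

// deviceID must be a valid input-capable AudioDeviceID, found elsewhere
// (e.g. via AudioObjectGetPropertyData / kAudioHardwarePropertyDevices).
func setInputDevice(_ deviceID: AudioDeviceID, on engine: AVAudioEngine) -> OSStatus {
    guard let unit = engine.inputNode.audioUnit else {
        return kAudioUnitErr_Uninitialized
    }
    var device = deviceID
    return AudioUnitSetProperty(unit,
                                kAudioOutputUnitProperty_CurrentDevice,
                                kAudioUnitScope_Global,
                                0,
                                &device,
                                UInt32(MemoryLayout<AudioDeviceID>.size))
}
```

Call this before installing the tap and starting the engine, so the tap's `recordingFormat` reflects the chosen device.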

How to change audio pitch during playback? (Swift 4)

半世苍凉 submitted on 2020-01-23 01:44:28
Question: I'm looking to change the pitch and playback speed of some audio in real time with a slider or a variable (i.e. while the sound is playing) in Xcode, Swift 4. Currently I'm using AVAudioEngine, which allows me to set these values before playback starts, but I can't change them while the audio is actually playing. Here is the code in question: func Play() { engine = AVAudioEngine() audioPlayer = AVAudioPlayerNode() audioPlayer.volume = 1.0 let path = Bundle.main.path(forResource: "filename
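A sketch of the standard approach: insert an AVAudioUnitTimePitch between the player and the mixer; its `rate` and `pitch` properties can be changed at any time, including while audio is playing. The slider values shown are illustrative:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()

engine.attach(player)
engine.attach(timePitch)
engine.connect(player, to: timePitch, format: nil)
engine.connect(timePitch, to: engine.mainMixerNode, format: nil)

// Later, e.g. from a slider callback while the sound is playing:
timePitch.rate = 1.5    // playback speed multiplier, range 0.25...4.0
timePitch.pitch = 300   // in cents; +300 is three semitones up
```

Because the node is part of the running graph, assigning to `rate`/`pitch` takes effect immediately, with no need to stop and restart the player.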