AVAudioEngine

Swift 3 AVAudioEngine set microphone input format

馋奶兔 submitted on 2019-12-06 10:50:03
I want to process the bytes read from the microphone using Swift 3 on iOS. I currently use AVAudioEngine.

    print(inputNode.inputFormat(forBus: bus).settings)
    print(inputNode.inputFormat(forBus: bus).formatDescription)

This gives me the following output:

    ["AVNumberOfChannelsKey": 1, "AVLinearPCMBitDepthKey": 32, "AVSampleRateKey": 16000, "AVLinearPCMIsNonInterleaved": 1, "AVLinearPCMIsBigEndianKey": 0, "AVFormatIDKey": 1819304813, "AVLinearPCMIsFloatKey": 1]
    <CMAudioFormatDescription 0x14d5bbb0 [0x3a5fb7d8]> { mediaType:'soun' mediaSubType:'lpcm' mediaSpecific: { ASBD: { mSampleRate: 16000
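The question is cut off above. For what it's worth, here is a minimal playground-style sketch of one common approach, assuming modern Swift naming and that the goal is 16 kHz mono float buffers; the hardware input format usually can't be overridden directly, so the tap converts with AVAudioConverter. The names `engine` and the buffer size are illustrative assumptions, not from the post.

    import AVFoundation

    // Hedged sketch: tap at the hardware format, convert to 16 kHz mono.
    // (Assumes an active AVAudioSession configured for recording.)
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let hwFormat = input.inputFormat(forBus: 0)

    guard let desired = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                      sampleRate: 16000,
                                      channels: 1,
                                      interleaved: false),
          let converter = AVAudioConverter(from: hwFormat, to: desired) else {
        fatalError("could not build format/converter")
    }

    input.installTap(onBus: 0, bufferSize: 4096, format: hwFormat) { buffer, _ in
        let ratio = desired.sampleRate / hwFormat.sampleRate
        let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
        guard let out = AVAudioPCMBuffer(pcmFormat: desired, frameCapacity: capacity) else { return }
        var consumed = false
        var err: NSError?
        _ = converter.convert(to: out, error: &err) { _, outStatus in
            if consumed { outStatus.pointee = .noDataNow; return nil }
            consumed = true
            outStatus.pointee = .haveData
            return buffer
        }
        // `out` now holds 16 kHz mono float samples to process.
    }
    try? engine.start()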

AVAudioEngine inputNode's format changes when playing an AVAudioPlayerNode

随声附和 submitted on 2019-12-05 15:47:39
I'll start with a simple "playground" view controller class I've made that demonstrates my problem:

    class AudioEnginePlaygroundViewController: UIViewController {
        private var audioEngine: AVAudioEngine!
        private var micTapped = false

        override func viewDidLoad() {
            super.viewDidLoad()
            configureAudioSession()
            audioEngine = AVAudioEngine()
        }

        @IBAction func toggleMicTap(_ sender: Any) {
            guard let mic = audioEngine.inputNode else { return }
            if micTapped {
                mic.removeTap(onBus: 0)
                micTapped = false
                return
            }
            stopAudioPlayback()
            let micFormat = mic.inputFormat(forBus: 0)
            print("installing tap: \(micFormat
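The snippet is truncated above. As an aside, a hedged sketch of one way to react when the input format changes mid-session (my assumption, not the thread's accepted answer, and using Swift 4.2+ syntax): listen for the engine's configuration-change notification and reinstall the tap with the fresh format.

    // Hedged sketch, e.g. placed in viewDidLoad of the class above:
    NotificationCenter.default.addObserver(
        forName: .AVAudioEngineConfigurationChange,
        object: audioEngine,
        queue: .main
    ) { [weak self] _ in
        guard let self = self, self.micTapped else { return }
        let mic = self.audioEngine.inputNode
        mic.removeTap(onBus: 0)
        let freshFormat = mic.inputFormat(forBus: 0)
        mic.installTap(onBus: 0, bufferSize: 2048, format: freshFormat) { buffer, _ in
            // handle buffers in the new format
        }
    }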

How do I create an AUAudioUnit that implements multiple audio units?

这一生的挚爱 submitted on 2019-12-05 15:35:54
In Apple's docs for creating an AUAudioUnit (here: https://developer.apple.com/documentation/audiotoolbox/auaudiounit/1387570-initwithcomponentdescription ) they claim that "A single audio unit subclass may implement multiple audio units—for example, an effect that can also function as a generator, or a cluster of related effects." There are no examples of this online that I can find. Ideally it would be nice if your answer/solution involved Swift and AVAudioEngine, but I'd happily accept any answer that gets me moving in the right direction. Thanks in advance. I posted some source code to
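The post is cut off above. A hedged sketch of the direction Apple's sentence suggests (the class name and four-char codes are my own illustrative assumptions): register one AUAudioUnit subclass under several component descriptions and branch on the component type at init.

    import AudioToolbox
    import AVFoundation

    // Hedged sketch: one subclass, multiple component descriptions.
    class MultiRoleAudioUnit: AUAudioUnit {
        override init(componentDescription: AudioComponentDescription,
                      options: AudioComponentInstantiationOptions = []) throws {
            try super.init(componentDescription: componentDescription, options: options)
            switch componentDescription.componentType {
            case kAudioUnitType_Effect:
                break // set up DSP for the effect role
            case kAudioUnitType_Generator:
                break // set up the generator role
            default:
                break
            }
        }
    }

    // Register the same class twice, once per role ('Xmpl'/'Demo' are made up):
    let effect = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                           componentSubType: 0x586d706c,      // 'Xmpl'
                                           componentManufacturer: 0x44656d6f, // 'Demo'
                                           componentFlags: 0, componentFlagsMask: 0)
    let generator = AudioComponentDescription(componentType: kAudioUnitType_Generator,
                                              componentSubType: 0x586d706c,
                                              componentManufacturer: 0x44656d6f,
                                              componentFlags: 0, componentFlagsMask: 0)
    AUAudioUnit.registerSubclass(MultiRoleAudioUnit.self, as: effect,
                                 name: "Demo: Example Effect", version: 1)
    AUAudioUnit.registerSubclass(MultiRoleAudioUnit.self, as: generator,
                                 name: "Demo: Example Generator", version: 1)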

Connecting AVAudioMixerNode to AVAudioEngine

☆樱花仙子☆ submitted on 2019-12-04 17:38:35
I use AVAudioMixerNode to change the audio format. This entry helped me a lot. The code below gives me the data I want, but I hear my own voice on the phone's speaker. How can I prevent that?

    func startAudioEngine() {
        engine = AVAudioEngine()
        guard let engine = engine, let input = engine.inputNode else {
            // @TODO: error out
            return
        }
        let downMixer = AVAudioMixerNode()
        // I think the engine's I/O nodes are already attached to it by default,
        // so we attach only the downMixer here:
        engine.attach(downMixer)
        // You can tap the downMixer to intercept the audio and do something with it:
        downMixer.installTap(onBus:
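The snippet ends mid-call above. A minimal sketch of one way to silence the monitoring path while keeping the tap alive (my assumption, not necessarily the thread's answer): mute the mixer's output volume.

    import AVFoundation

    // Hedged sketch: route mic -> downMixer -> main mixer, but mute the mixer.
    // The tap still receives buffers; the speaker stays silent.
    func wireSilently(engine: AVAudioEngine, downMixer: AVAudioMixerNode) {
        let input = engine.inputNode
        engine.connect(input, to: downMixer, format: input.inputFormat(forBus: 0))
        engine.connect(downMixer, to: engine.mainMixerNode, format: nil)
        downMixer.outputVolume = 0 // mute playback, keep the tap's data flowing
    }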

Seeking to a time in a song with AVAudioEngine

眉间皱痕 submitted on 2019-12-04 13:53:36
Question: I am playing a song using AVAudioPlayerNode and I am trying to control its time using a UISlider, but I can't figure out how to seek the time using AVAudioEngine.

Answer 1: After MUCH trial and error I think I have finally figured this out. First you need to calculate the sample rate of your file. To do this, get the last render time of your AudioNode:

    var nodetime: AVAudioTime = self.playerNode.lastRenderTime
    var playerTime: AVAudioTime = self.playerNode.playerTimeForNodeTime(nodetime)
    var
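The answer is truncated above; a fuller version appears in a later entry. A short sketch of the step it describes, with modern Swift API names (an assumption on my part, since the quoted code uses older naming):

    import AVFoundation

    // Hedged sketch: read the player's effective sample rate, if it has one.
    func sampleRate(of playerNode: AVAudioPlayerNode) -> Double? {
        guard let nodeTime = playerNode.lastRenderTime,
              let playerTime = playerNode.playerTime(forNodeTime: nodeTime) else { return nil }
        return playerTime.sampleRate // the seek target frame is seconds * sampleRate
    }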

SKAudioNode() crashes when plugging in/out headphones

雨燕双飞 submitted on 2019-12-04 02:29:58
I am using an SKAudioNode() to play background music in my game. I have a play/pause function and everything works fine until I plug in my headphones. Then there is no sound at all, and when I call the pause/play function I get this error:

    AVAudioPlayerNode.mm:333: Start: required condition is false: _engine->IsRunning()
    com.apple.coreaudio.avfaudio', reason: 'required condition is false: _engine->IsRunning()

Does anyone know what this means? Code:

    import SpriteKit

    class GameScene: SKScene {
        let loop = SKAudioNode(fileNamed: "gameloop.mp3")
        let play = SKAction.play()
        let pause = SKAction.pause()
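A hedged guess at the cause, with a sketch (not a confirmed fix from the thread): plugging headphones in or out triggers an audio route change, which can stop the engine behind SpriteKit's audio, so observing route changes gives you a hook to recover.

    import AVFoundation

    // Hedged sketch: watch for route changes and recover the audio node.
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        // e.g. remove `loop` from the scene and add a fresh SKAudioNode,
        // or restart whatever engine/session state was torn down.
    }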

Seeking to a time in a song with AVAudioEngine

谁说我不能喝 submitted on 2019-12-03 08:44:17
I am playing a song using AVAudioPlayerNode and I am trying to control its time using a UISlider, but I can't figure out how to seek the time using AVAudioEngine. After MUCH trial and error I think I have finally figured this out. First you need to calculate the sample rate of your file. To do this, get the last render time of your AudioNode:

    var nodetime: AVAudioTime = self.playerNode.lastRenderTime
    var playerTime: AVAudioTime = self.playerNode.playerTimeForNodeTime(nodetime)
    var sampleRate = playerTime.sampleRate

Then, multiply your sample rate by the new time in seconds. This will give
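The answer is cut off above. A fuller sketch of the same idea in modern Swift (my reconstruction, not the answer verbatim, and assuming an AVAudioFile named `audioFile`): stop the player, schedule a segment starting at the target frame, and play.

    import AVFoundation

    // Hedged sketch: seek by rescheduling a segment of the file.
    func seek(to seconds: Double, playerNode: AVAudioPlayerNode, audioFile: AVAudioFile) {
        let sampleRate = audioFile.processingFormat.sampleRate
        let startFrame = AVAudioFramePosition(seconds * sampleRate)
        let framesLeft = AVAudioFrameCount(max(0, audioFile.length - startFrame))
        guard framesLeft > 0 else { return }

        playerNode.stop() // also flushes anything already scheduled
        playerNode.scheduleSegment(audioFile,
                                   startingFrame: startFrame,
                                   frameCount: framesLeft,
                                   at: nil,
                                   completionHandler: nil)
        playerNode.play()
    }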

Swift: Is it possible to save audio from AVAudioEngine, or from AVAudioPlayerNode? If yes, how?

可紊 submitted on 2019-12-03 07:50:41
Question: I've been looking around the Swift documentation for a way to save the audio output of an AVAudioEngine, but I couldn't find any useful tip. Any suggestion?

Solution: I found a way around thanks to matt's answer. Here is some sample code for saving audio after passing it through an AVAudioEngine (I think that technically it's before):

    newAudio = AVAudioFile(forWriting: newAudio.url, settings: nil, error: NSErrorPointer())
    // Your new file on which you want to save some changed audio, and prepared to be buffered
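The answer's code is truncated above and uses a long-retired Swift 1 initializer. A modernized sketch of the same idea, under the assumption that tapping the main mixer is acceptable (function name and buffer size are illustrative):

    import AVFoundation

    // Hedged sketch: append every tapped buffer to a new AVAudioFile.
    func record(engine: AVAudioEngine, to url: URL) throws {
        let format = engine.mainMixerNode.outputFormat(forBus: 0)
        let file = try AVAudioFile(forWriting: url, settings: format.settings)
        engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
            do {
                try file.write(from: buffer) // append; AVAudioFile keeps the write head
            } catch {
                print("write failed: \(error)")
            }
        }
    }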

AVAudioSequencer Causes Crash on Deinit/Segue: 'required condition is false: outputNode'

陌路散爱 submitted on 2019-12-02 09:47:02
The code below causes a crash with the following errors whenever the object is deinitialized (e.g. when performing an unwind segue back to another ViewController):

    required condition is false: [AVAudioEngineGraph.mm:4474:GetDefaultMusicDevice: (outputNode)]
    Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: outputNode'

The AVAudioSequencer is the root of the issue, because the error ceases if it is removed. How can this crash be avoided?

    class TestAudioClass {
        private var audioEngine: AVAudioEngine
        private var sampler:
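The class is cut off above. A hedged sketch of one mitigation (my assumption about teardown order, not a confirmed fix from the thread): make the sequencer optional and release it before the engine, so it never outlives the graph it was created against.

    import AVFoundation

    // Hedged sketch: tear down the sequencer before the engine.
    class TestAudioClass {
        private var audioEngine: AVAudioEngine? = AVAudioEngine()
        private lazy var sequencer: AVAudioSequencer? = AVAudioSequencer(audioEngine: audioEngine!)

        deinit {
            sequencer?.stop()
            sequencer = nil    // sequencer first...
            audioEngine?.stop()
            audioEngine = nil  // ...then the engine
        }
    }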

Swift: Trying to control time in AVAudioPlayerNode using UISlider

点点圈 submitted on 2019-12-02 09:04:46
Question: I'm using an AVAudioPlayerNode attached to an AVAudioEngine to play a sound. To get the current time of the player I'm doing this:

    extension AVAudioPlayerNode {
        var currentTime: TimeInterval {
            get {
                if let nodeTime: AVAudioTime = self.lastRenderTime,
                   let playerTime: AVAudioTime = self.playerTime(forNodeTime: nodeTime) {
                    return Double(playerTime.sampleTime) / playerTime.sampleRate
                }
                return 0
            }
        }
    }

I have a slider that indicates the current time of the audio. When the user changes the slider
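The question is truncated above. A small usage sketch for the read side, building on the extension in the post (the duration computation, function name, and polling interval are my assumptions): poll `currentTime` to keep the slider in sync while playing.

    import AVFoundation
    import UIKit

    // Hedged sketch: drive the slider from the `currentTime` extension above.
    func syncSlider(_ slider: UISlider, playerNode: AVAudioPlayerNode, audioFile: AVAudioFile) -> Timer {
        let duration = Double(audioFile.length) / audioFile.processingFormat.sampleRate
        return Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
            slider.value = Float(playerNode.currentTime / duration)
        }
    }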