Swift: Sound Output & Microphone Input using AudioKit

Submitted anonymously (unverified) on 2019-12-03 01:33:01

Question:


I'm using Xcode Version 9.2
I'm using AudioKit Version 4.0.4


I've written some code (see below) that should be able to

  • play a specific sound (frequency: 500.0 Hz)
  • "listen" to the microphone input and calculate its frequency in real time

If I call playSound() or receiveSound() separately, everything works exactly as I expect. But when I call playSound() and then receiveSound() afterwards, I run into big issues.

This is how I'd like to get the code working:

SystemClass.playSound() // play sound
DispatchQueue.main.asyncAfter(deadline: (DispatchTime.now() + 3.0)) {
    SystemClass.receiveSound() // get microphone input 3 seconds later
}

let SystemClass: System = System()

class System {
    public init() { }

    func playSound() {
        let sound = AKOscillator()
        AudioKit.output = sound
        AudioKit.start()
        sound.frequency = 500.0
        sound.amplitude = 0.5
        sound.start()
        DispatchQueue.main.asyncAfter(deadline: (DispatchTime.now() + 2.0)) {
            sound.stop()
        }
    }

    var tracker: AKFrequencyTracker!

    func receiveSound() {
        AudioKit.stop()
        AKSettings.audioInputEnabled = true
        let mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        let silence = AKBooster(tracker, gain: 0)
        AudioKit.output = silence
        AudioKit.start()
        Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(SystemClass.outputFrequency), userInfo: nil, repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}

These are some of the runtime error messages I get every time I run the code (calling playSound() and then receiveSound() 3 seconds later):

AVAEInternal.h:103:_AVAE_CheckNoErr: [AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

AVAudioEngine.mm:149:-[AVAudioEngine prepare]: Engine@0x1c401bff0: could not initialize, error = -10875

[MediaRemote] [AVOutputContext] WARNING: AVF context unavailable for sharedSystemAudioContext

[AVAudioEngineGraph.mm:1266:Initialize: (err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)): error -10875

Fatal error: AudioKit: Could not start engine. error: Error Domain=com.apple.coreaudio.avfaudio Code=-10875 "(null)" UserInfo={failed call=err = AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainOptimizedTraversal, *GetOutputNode(), isOutputChainActive)}.: file /Users/megastep/src/ak/AudioKit/AudioKit/Common/Internals/AudioKit.swift, line 243

Answer 1:

I believe the lion's share of your problems is due to the local declaration of AKNodes within the functions that use them:

    let sound = AKOscillator()
    let mic = AKMicrophone()
    let silence = AKBooster(tracker, gain: 0)

Declare these as instance variables instead, as described here.
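A minimal sketch of that restructuring, assuming the AudioKit 4.0.x API from the question (AKOscillator, AKMicrophone, AKFrequencyTracker, AKBooster). The nodes become stored properties so they outlive the function calls, instead of being deallocated as soon as playSound() or receiveSound() returns:

```swift
import AudioKit

class System {
    // Keep every AKNode alive for the lifetime of the object,
    // so the audio graph is not torn down when a function returns.
    let oscillator = AKOscillator()
    let mic = AKMicrophone()
    var tracker: AKFrequencyTracker!
    var silence: AKBooster!

    public init() {
        AKSettings.audioInputEnabled = true
        tracker = AKFrequencyTracker(mic)
        silence = AKBooster(tracker, gain: 0) // mute the monitoring chain
    }

    func playSound() {
        AudioKit.output = oscillator
        AudioKit.start()
        oscillator.frequency = 500.0
        oscillator.amplitude = 0.5
        oscillator.start()
        DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
            self.oscillator.stop()
        }
    }

    func receiveSound() {
        AudioKit.stop()
        AudioKit.output = silence
        AudioKit.start()
        Timer.scheduledTimer(timeInterval: 0.1, target: self,
                             selector: #selector(outputFrequency),
                             userInfo: nil, repeats: true)
    }

    @objc func outputFrequency() {
        print("Frequency: \(tracker.frequency)")
    }
}
```

With the nodes held as instance variables, swapping AudioKit.output between the oscillator and the muted tracker chain no longer leaves the engine pointing at deallocated nodes, which is one common source of the -10875 initialization error shown above.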


