What is a correct way to manage AudioKit's lifecycle?


Alexey,

My recommendation for managing AudioKit's lifecycle is to house it within a singleton class. This is how it's set up in some of the AudioKit examples included in the repo, such as Analog Synth X and Drums. That way, it isn't bound to a specific ViewController's viewDidLoad and can be accessed from multiple ViewControllers, or from the AppDelegate that manages the app's state (there's a sketch of that after the Conductor code below). It also guarantees that only one instance of the audio engine is ever created.

Here's an example where AudioKit is initialized within a class called Conductor (it could also be named AudioManager, etc.):

import AudioKit
import AudioKitUI

// Treat the conductor like a manager for the audio engine.
class Conductor {

    // Singleton of the Conductor class to avoid multiple instances of the audio engine
    static let sharedInstance = Conductor()

    // Create instance variables
    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    // Add effects
    var delay: AKDelay!
    var reverb: AKCostelloReverb!

    // Balance between the delay and reverb mix.
    var reverbAmountMixer = AKDryWetMixer()

    init() {

        // Allow audio to play while the iOS device is muted.
        AKSettings.playbackWhileMuted = true

        // Route audio to the device's speaker (rather than the ear receiver) while the mic is active.
        AKSettings.defaultToSpeaker = true

        // Capture mic input
        mic = AKMicrophone()

        // Pull mic output into the tracker node.
        tracker = AKAmplitudeTracker(mic)

        // Pull the tracker output into the delay effect node.
        delay = AKDelay(tracker)
        delay.time = 2.0
        delay.feedback = 0.1
        delay.dryWetMix = 0.5

        // Pull the delay output into the reverb effect node.
        reverb = AKCostelloReverb(delay)
        reverb.presetShortTailCostelloReverb()

        // Mix the amount of reverb to the delay output node.
        reverbAmountMixer = AKDryWetMixer(delay, reverb, balance: 0.8)

        // Assign the reverbAmountMixer output to be the final audio output
        AudioKit.output = reverbAmountMixer

        // Start the AudioKit engine
        // Starting lives in its own method so the AppDelegate can stop and restart the engine as the app's state changes (see the AppDelegate sketch below).
        startAudioEngine()

    }

    internal func startAudioEngine() {
        AudioKit.start()
        print("Audio engine started")
    }

    internal func stopAudioEngine() {
        AudioKit.stop()
        print("Audio engine stopped")
    }
}
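
Because startAudioEngine() and stopAudioEngine() are exposed on the singleton, the AppDelegate can drive them from the app's lifecycle callbacks. Here's a minimal sketch of that idea; these particular method bodies are my assumption, not code from the repo:

import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    // Touching the singleton here builds the audio graph at launch.
    let conductor = Conductor.sharedInstance

    func applicationWillResignActive(_ application: UIApplication) {
        // Stop the engine when the app is about to move to the background.
        conductor.stopAudioEngine()
    }

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Restart the engine when the app returns to the foreground.
        conductor.startAudioEngine()
    }
}

In a real app you might also guard against starting an engine that's already running, since applicationDidBecomeActive also fires at launch, right after Conductor's init has already called startAudioEngine().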

Here's how to access the amplitude-tracking data from the Conductor singleton class within a ViewController:

import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    override func viewDidLoad() {
        super.viewDidLoad()

        // Poll the tracker's amplitude every 10 ms.
        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [unowned self] (timer) in
            print(self.conductor.tracker.amplitude)
        }

    }
}
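
One caveat with that snippet: a repeating Timer retains its closure, and [unowned self] will crash if the view controller is deallocated while the timer is still firing. A slightly safer variant (same idea, just holding onto the timer so it can be invalidated; the class name here is hypothetical) might look like this:

import UIKit

class SafeViewController: UIViewController {

    var conductor = Conductor.sharedInstance
    private var amplitudeTimer: Timer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // [weak self] avoids a crash if the view controller goes away while the timer is live.
        amplitudeTimer = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            print(self.conductor.tracker.amplitude)
        }
    }

    deinit {
        // Stop polling once the view controller is deallocated.
        amplitudeTimer?.invalidate()
    }
}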

You can download the full example project from GitHub:

https://github.com/markjeschke/AudioKit-Amplitude-Tracker

I hope this helps.

Take care,
Mark

From what I can see, your code looks like it should work, so the problem may lie elsewhere in your project. I made a stripped-down demo to test the basics, and it works. I just added a timer to poll the amplitude.

import UIKit
import AudioKit

class ViewController: UIViewController {

    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    override func viewDidLoad() {
        super.viewDidLoad()

        mic = AKMicrophone()
        tracker = AKAmplitudeTracker(mic)
        AudioKit.output = tracker
        AudioKit.start()

        // Poll the amplitude ten times per second.
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { (timer) in
            print(self.tracker.amplitude)
        }
    }
}
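
One more thing worth checking if the amplitude stays at zero: both of these demos capture the microphone, so the app's Info.plist needs an NSMicrophoneUsageDescription entry, and the user has to grant recording permission. As a rough sketch (plain AVFoundation, nothing AudioKit-specific), you can confirm the permission before starting the engine:

import AVFoundation

// Ask for (or confirm) microphone access before starting the audio engine.
AVAudioSession.sharedInstance().requestRecordPermission { granted in
    if granted {
        print("Microphone access granted")
    } else {
        print("Microphone access denied; amplitude will read as zero")
    }
}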