Question
I'm building an app that has to track the input amplitude of the user's mic. AudioKit has a bunch of handy objects for my needs: AKAmplitudeTracker and so on. I haven't found any viable info on how to start AudioKit, begin tracking, etc.
For now, all code related to AudioKit initialization is in the viewDidLoad method of the root VC of my audio recorder module. This isn't correct, because random errors occur and I can't track what's wrong. The code below shows how I use AudioKit now.
var silence: AKBooster!
var tracker: AKAmplitudeTracker!
var mic: AKMicrophone!
...
override func viewDidLoad() {
    super.viewDidLoad()
    switch AVAudioSession.sharedInstance().recordPermission() {
    case AVAudioSessionRecordPermission.granted:
        self.mic = AKMicrophone()
        self.tracker = AKAmplitudeTracker(self.mic)
        AKSettings.audioInputEnabled = true
        AudioKit.output = self.tracker
        AudioKit.start()
        self.mic.start()
        self.tracker.start()
    case AVAudioSessionRecordPermission.undetermined:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }
    case AVAudioSessionRecordPermission.denied:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }
    default:
        print("")
    }
    ...
}
Please help me figure out how to correctly manage AudioKit.
Answer 1:
Alexey,
My recommendation for managing AudioKit's lifecycle is to house it within a singleton class. This is how it's set up in some of the AudioKit examples included in the repo, such as Analog Synth X and Drums. That way, it's not bound to a specific ViewController's viewDidLoad and can be accessed from multiple ViewControllers or from the AppDelegate that manages the app's state. It also ensures that you only create one instance of it.
Here's an example where AudioKit is initialized within a class called Conductor (it could also be called AudioManager, etc.):
import AudioKit
import AudioKitUI

// Treat the conductor like a manager for the audio engine.
class Conductor {

    // Singleton of the Conductor class to avoid multiple instances of the audio engine
    static let sharedInstance = Conductor()

    // Create instance variables
    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    // Add effects
    var delay: AKDelay!
    var reverb: AKCostelloReverb!

    // Balance between the delay and reverb mix.
    var reverbAmountMixer = AKDryWetMixer()

    init() {
        // Allow audio to play while the iOS device is muted.
        AKSettings.playbackWhileMuted = true
        AKSettings.defaultToSpeaker = true

        // Capture mic input
        mic = AKMicrophone()

        // Pull mic output into the tracker node.
        tracker = AKAmplitudeTracker(mic)

        // Pull the tracker output into the delay effect node.
        delay = AKDelay(tracker)
        delay.time = 2.0
        delay.feedback = 0.1
        delay.dryWetMix = 0.5

        // Pull the delay output into the reverb effect node.
        reverb = AKCostelloReverb(delay)
        reverb.presetShortTailCostelloReverb()

        // Mix the amount of reverb to the delay output node.
        reverbAmountMixer = AKDryWetMixer(delay, reverb, balance: 0.8)

        // Assign the reverbAmountMixer output to be the final audio output
        AudioKit.output = reverbAmountMixer

        // Start the AudioKit engine
        // This is in its own method so that the audio engine will start and stop via the AppDelegate's current state.
        startAudioEngine()
    }

    internal func startAudioEngine() {
        AudioKit.start()
        print("Audio engine started")
    }

    internal func stopAudioEngine() {
        AudioKit.stop()
        print("Audio engine stopped")
    }
}
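Since startAudioEngine() and stopAudioEngine() are separate methods, the AppDelegate can drive them from the app's state transitions, as the comment in the initializer suggests. A minimal sketch of that wiring (this AppDelegate is my own illustration, not part of the original answer):

```swift
import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    // Touching the singleton here creates it once at launch.
    let conductor = Conductor.sharedInstance

    // Resume audio when the app comes to the foreground.
    func applicationDidBecomeActive(_ application: UIApplication) {
        conductor.startAudioEngine()
    }

    // Stop the engine when the app resigns active, so it isn't
    // left running in the background unnecessarily.
    func applicationWillResignActive(_ application: UIApplication) {
        conductor.stopAudioEngine()
    }
}
```

Note that Conductor's init() already calls startAudioEngine(), so the first applicationDidBecomeActive call is redundant but harmless.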
Here's how to access the amplitude-tracking data from the Conductor singleton class in the ViewController:
import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    override func viewDidLoad() {
        super.viewDidLoad()

        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [unowned self] (timer) in
            print(self.conductor.tracker.amplitude)
        }
    }
}
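One caveat with the snippet above: a repeating Timer retains its closure until invalidated, and [unowned self] will crash if the ViewController is deallocated while the timer is still firing. A hedged variant that stores and invalidates the timer (the amplitudeTimer property name is my own):

```swift
import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    // Keep a reference to the timer so it can be invalidated later.
    private var amplitudeTimer: Timer?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // [weak self] instead of [unowned self]: the closure simply
        // stops doing work if the view controller goes away.
        amplitudeTimer = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            print(self.conductor.tracker.amplitude)
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Stop polling when the view is no longer on screen.
        amplitudeTimer?.invalidate()
        amplitudeTimer = nil
    }
}
```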
You can download this GitHub repo here: https://github.com/markjeschke/AudioKit-Amplitude-Tracker
I hope this helps.
Take care,
Mark
Answer 2:
From what I can see, it looks like it should be working; there might be something going on elsewhere in your code. I made a stripped-down demo to test the basics, and it works. I just added a timer to poll the amplitude.
import UIKit
import AudioKit

class ViewController: UIViewController {

    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    override func viewDidLoad() {
        super.viewDidLoad()

        mic = AKMicrophone()
        tracker = AKAmplitudeTracker(mic)
        AudioKit.output = tracker
        AudioKit.start()

        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { (timer) in
            print(self.tracker.amplitude)
        }
    }
}
Source: https://stackoverflow.com/questions/44258954/what-is-a-correct-way-to-manage-audiokits-lifecycle