I have 15 WAV files that I need to play back in sequence, each on its own channel. To start, I'm trying to get two files working with left/right stereo separation.
I'm creating an audio engine, a mixer, and two AVAudioPlayerNodes. The audio files are mono, and I'm trying to get the file from playerA to come out of the left channel and the file from playerB to come out of the right channel. What I'm having trouble understanding is how AudioUnitSetProperty works. It seems to relate to a single file only, and it seems there can only be one per audio unit? Is there a way I can associate a file with an audio unit? I can't seem to retrieve the audioUnit object associated with each track.
func testCode() {
    // Get the output hardware format.
    let output = engine.outputNode
    let outputHWFormat = output.outputFormat(forBus: 0)

    // Connect the mixer to the output.
    let mixer = engine.mainMixerNode
    engine.connect(mixer, to: output, format: outputHWFormat)

    // Then work on the player end by first attaching the players to the engine.
    engine.attach(playerA)
    engine.attach(playerB)

    // Find the audio files.
    guard let audioFileURLA = Bundle.main.url(forResource: "test", withExtension: "wav") else {
        fatalError("audio file is not in bundle.")
    }
    guard let audioFileURLB = Bundle.main.url(forResource: "test2", withExtension: "wav") else {
        fatalError("audio file is not in bundle.")
    }

    var songFileA: AVAudioFile?
    do {
        songFileA = try AVAudioFile(forReading: audioFileURLA)
        print(songFileA!.processingFormat)
        // Connect player A to the mixer.
        engine.connect(playerA, to: mixer, format: songFileA!.processingFormat)
    } catch {
        fatalError("cannot create AVAudioFile \(error)")
    }

    let channelMap: [Int32] = [0, -1] // play channel in left
    let propSize: UInt32 = UInt32(channelMap.count) * UInt32(MemoryLayout<Int32>.size)
    print(propSize)

    let code: OSStatus = AudioUnitSetProperty((engine.inputNode?.audioUnit)!,
                                              kAudioOutputUnitProperty_ChannelMap,
                                              kAudioUnitScope_Global,
                                              1,
                                              channelMap,
                                              propSize)
    print(code)

    let channelMapB: [Int32] = [-1, 0] // play channel in right

    var songFileB: AVAudioFile?
    do {
        songFileB = try AVAudioFile(forReading: audioFileURLB)
        print(songFileB!.processingFormat)
        // Connect player B to the mixer.
        engine.connect(playerB, to: mixer, format: songFileB!.processingFormat)
    } catch {
        fatalError("cannot create AVAudioFile \(error)")
    }

    let codeB: OSStatus = AudioUnitSetProperty((engine.inputNode?.audioUnit)!,
                                               kAudioOutputUnitProperty_ChannelMap,
                                               kAudioUnitScope_Global,
                                               1,
                                               channelMapB,
                                               propSize)
    print(codeB)

    do {
        try engine.start()
    } catch {
        fatalError("Could not start engine. error: \(error).")
    }

    playerA.scheduleFile(songFileA!, at: nil) {
        print("done")
        self.playerA.play()
    }
    playerB.scheduleFile(songFileB!, at: nil) {
        print("done")
        self.playerB.play()
    }

    playerA.play()
    playerB.play()
    print(playerA.isPlaying)
}
engine.connect(mixer, to: output, format: outputHWFormat)
This isn't necessary; the main mixer is implicitly connected to the output node when it is first accessed.
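For example, a minimal graph setup could look like this (a sketch reusing the engine, playerA and songFileA names from the question, not the poster's exact code); only the player-to-mixer connection is needed:

engine.attach(playerA)
// Connecting the player to the main mixer is enough; the main mixer
// is implicitly connected to engine.outputNode when first accessed.
engine.connect(playerA, to: engine.mainMixerNode, format: songFileA!.processingFormat)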
As for panning: AudioUnitSetProperty also isn't necessary. AVAudioPlayerNode conforms to AVAudioMixing, so since there is a mixer node downstream from the player, all you have to do is this:
playerA.pan = -1
playerB.pan = 1
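Putting it together, a minimal sketch (assuming both players are already attached and connected to the main mixer with their mono file formats, as in the question's code):

playerA.pan = -1.0   // hard left
playerB.pan = 1.0    // hard right

playerA.scheduleFile(songFileA!, at: nil, completionHandler: nil)
playerB.scheduleFile(songFileB!, at: nil, completionHandler: nil)

do {
    try engine.start()
} catch {
    fatalError("Could not start engine: \(error)")
}
playerA.play()
playerB.play()

pan is a Float ranging from -1.0 (full left) to 1.0 (full right), so intermediate values position a mono source anywhere in the stereo field.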
Source: https://stackoverflow.com/questions/46041563/playing-multiple-wav-out-multiple-channels-avaudioengine