AVAudioEngine

AVAudioPlayer.play() works but AVAudioPlayerNode.play() fails

Submitted by 扶醉桌前 on 2019-12-11 11:45:47
Question: I have the following Swift playground code that plays an audio file using AVAudioPlayerNode: import AVFoundation import Foundation NSSetUncaughtExceptionHandler { exception in print("Exception thrown: \(exception)") } var filePath = "/Users/fractor/Desktop/TestFile.mp3" let file: AVAudioFile do { file = try AVAudioFile(forReading: URL(fileURLWithPath: filePath)) } catch let error { print(error.localizedDescription) throw error } let audioEngine = AVAudioEngine() let playerNode = …
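The excerpt cuts off before the player node is wired up. A minimal working sketch of the same playground, assuming the same hypothetical file path: the usual reason AVAudioPlayerNode.play() is silent while AVAudioPlayer works is that the node was never attached and connected, or the engine was not started before play().

```swift
import AVFoundation

// Hypothetical path from the question; substitute your own file.
let fileURL = URL(fileURLWithPath: "/Users/fractor/Desktop/TestFile.mp3")
let file = try AVAudioFile(forReading: fileURL)

let audioEngine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

// The node must be attached and connected *before* the engine starts,
// and the engine must be running before play() produces any sound.
audioEngine.attach(playerNode)
audioEngine.connect(playerNode, to: audioEngine.mainMixerNode,
                    format: file.processingFormat)

try audioEngine.start()
playerNode.scheduleFile(file, at: nil, completionHandler: nil)
playerNode.play()
```

In a playground, also keep the process alive (e.g. `PlaygroundPage.current.needsIndefiniteExecution = true`), or execution ends before any audio is heard.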

URL is nil in AVAudioFile

Submitted by 心已入冬 on 2019-12-11 05:28:54
Question: I am trying to play multiple audio files on top of a background audio file using AVAudioEngine. When I try to initialize backgroundAudioFile, the app crashes with: Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.coreaudio.avfaudio Code=2003334207 "(null)" UserInfo={failed call=ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)}. The URL passed to the initializer is valid; it is printed out. import Foundation import AVFoundation class …
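A URL that prints correctly can still fail to open: error 2003334207 ('wht?') from ExtAudioFileOpenURL typically means the file does not exist at that path or is not a format Core Audio can decode. A defensive sketch that checks before opening, using a hypothetical helper:

```swift
import AVFoundation

// Verify the file is actually on disk before handing it to AVAudioFile,
// and use do/catch instead of try! so a failure is diagnosable.
func openAudioFile(at url: URL) -> AVAudioFile? {
    guard FileManager.default.fileExists(atPath: url.path) else {
        print("No file at \(url.path)")
        return nil
    }
    do {
        return try AVAudioFile(forReading: url)
    } catch {
        print("Could not open \(url.lastPathComponent): \(error)")
        return nil
    }
}
```

A common pitfall is constructing the URL with `URL(string:)` instead of `URL(fileURLWithPath:)`, which yields a URL that prints plausibly but is not a file URL.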

AVAudioPCMBuffer built programmatically, not playing back in stereo

Submitted by 眉间皱痕 on 2019-12-10 21:55:32
Question: I'm trying to fill an AVAudioPCMBuffer programmatically in Swift to build a metronome. This is the first real app I'm trying to build, so it's also my first audio app. Right now I'm experimenting with different frameworks and methods of getting the metronome to loop accurately. I'm trying to build an AVAudioPCMBuffer with the length of a measure/bar so that I can use the .Loops option of AVAudioPlayerNode's scheduleBuffer method. I start by loading my file (2 ch, 44100 Hz, Float32, non…
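When a programmatically filled buffer plays back mono, the usual cause is that only channel 0 of `floatChannelData` was written. A sketch of filling a stereo, non-interleaved Float32 buffer one bar long (the tempo and tone frequency here are illustrative, not from the question):

```swift
import AVFoundation

// One bar of 4 beats at an assumed 120 BPM, 44.1 kHz, stereo Float32.
let sampleRate = 44100.0
let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 2)!
let framesPerBar = AVAudioFrameCount(sampleRate * 60.0 / 120.0 * 4.0)
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: framesPerBar)!
buffer.frameLength = framesPerBar  // must be set, or the buffer plays as empty

// floatChannelData is an array of per-channel sample pointers; to hear
// stereo you must fill *every* channel, not just channel 0.
if let channels = buffer.floatChannelData {
    for frame in 0..<Int(framesPerBar) {
        let sample = Float(sin(2.0 * .pi * 880.0 * Double(frame) / sampleRate))
        for ch in 0..<Int(format.channelCount) {
            channels[ch][frame] = sample * 0.25
        }
    }
}
```

The buffer can then be scheduled with `playerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)` for gapless looping.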

Save audio file with changed time pitch using AVAudioEngine in Swift

Submitted by 旧时模样 on 2019-12-10 13:35:26
Question: Currently, I am trying to change the time pitch of an existing audio file from the Documents folder, then overwrite the old file with the modified one using AVAudioEngine. I have got the first part working using AVAudioPlayerNode and AVAudioUnitTimePitch, but I have no idea about the second part. Can someone point me in the right direction? Thanks. Source: https://stackoverflow.com/questions/29012139/save-audio-file-with-changed-time-pitch-using-avaudioengine-in-swift
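The "second part" (saving instead of playing) can be done with AVAudioEngine's offline manual rendering mode (iOS 11+/macOS 10.13+): the same player → time-pitch graph renders into a buffer that is written to a file. A sketch, with `renderPitched` a hypothetical helper name:

```swift
import AVFoundation

// Render a pitched copy of `input` to `output` instead of the speaker.
func renderPitched(input: URL, output: URL, pitchCents: Float) throws {
    let file = try AVAudioFile(forReading: input)
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    pitch.pitch = pitchCents

    engine.attach(player)
    engine.attach(pitch)
    engine.connect(player, to: pitch, format: file.processingFormat)
    engine.connect(pitch, to: engine.mainMixerNode, format: file.processingFormat)

    // Detach the engine from the hardware and pull audio manually.
    try engine.enableManualRenderingMode(.offline,
                                         format: file.processingFormat,
                                         maximumFrameCount: 4096)
    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()

    let outFile = try AVAudioFile(forWriting: output,
                                  settings: file.fileFormat.settings)
    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!
    while engine.manualRenderingSampleTime < file.length {
        let toRender = min(buffer.frameCapacity,
                           AVAudioFrameCount(file.length - engine.manualRenderingSampleTime))
        if try engine.renderOffline(toRender, to: buffer) == .success {
            try outFile.write(from: buffer)
        }
    }
    engine.stop()
}
```

To "override" the original, render to a temporary URL and then replace the old file with FileManager; writing over a file while AVAudioFile still has it open for reading will fail.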

How to cancel or remove echo/repeated sound with AVAudioEngine?

Submitted by 人盡茶涼 on 2019-12-08 09:19:27
Question: I am using AVAudioEngine for audio streaming, but when I speak a word into the mic it repeats multiple times, like an echo effect. I want each word to sound only once, not multiple times; I want to cancel the echo and extra noise. How can I achieve this? var peerAudioEngine: AVAudioEngine = AVAudioEngine() var peerAudioPlayer: AVAudioPlayerNode = AVAudioPlayerNode() var peerInput: AVAudioInputNode? var peerInputFormat: AVAudioFormat? func setUpAVPlayer() { self.peerInput = …
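The repeated sound here is acoustic feedback: the peer's speaker output is picked up again by the mic and re-streamed. Two common mitigations, sketched under the assumption of a standard play-and-record session:

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()

// 1. The .voiceChat mode enables Apple's built-in acoustic echo
//    cancellation on the audio session.
try? session.setCategory(.playAndRecord, mode: .voiceChat,
                         options: [.defaultToSpeaker, .allowBluetooth])
try? session.setActive(true)

// 2. On iOS 13+, voice processing (echo cancellation + noise
//    suppression) can be enabled directly on the engine's input node.
let peerAudioEngine = AVAudioEngine()
if #available(iOS 13.0, *) {
    try? peerAudioEngine.inputNode.setVoiceProcessingEnabled(true)
}
```

Either approach must be configured before the engine starts; changing the session mode while the engine runs requires stopping and restarting it.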

Playing Multiple WAV out Multiple Channels AVAudioEngine

Submitted by 随声附和 on 2019-12-08 02:01:36
Question: I have 15 WAV files that I need to play back in sequence, each on its own channel. I'm starting out by trying to get two files working with a left/right stereo separation. I'm creating an audio engine, a mixer, and two AVAudioPlayerNodes. The audio files are mono, and I'm trying to get the file from PlayerA to come out the left channel and the file from PlayerB to come out the right channel. What I'm having trouble understanding is how AudioUnitSetProperty works. It seems to relate to a single file only, and seems to only allow one per audio unit? I'm wondering if there is a way I …
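For a simple left/right split of two mono sources, `AVAudioPlayerNode.pan` (from the AVAudioMixing protocol) may be enough, with no AudioUnitSetProperty call at all. A sketch, assuming the two files are loaded elsewhere:

```swift
import AVFoundation

let engine = AVAudioEngine()
let playerA = AVAudioPlayerNode()
let playerB = AVAudioPlayerNode()

engine.attach(playerA)
engine.attach(playerB)

// Connect both mono players into the stereo main mixer; the mixer
// upmixes each mono input and applies the pan.
let mono = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
engine.connect(playerA, to: engine.mainMixerNode, format: mono)
engine.connect(playerB, to: engine.mainMixerNode, format: mono)

playerA.pan = -1.0  // hard left
playerB.pan = 1.0   // hard right
```

True discrete routing to 15 output channels (rather than stereo panning) needs multichannel hardware and channel-map configuration on the output unit, which is where AudioUnitSetProperty with kAudioOutputUnitProperty_ChannelMap comes in.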

Swift 3 AVAudioEngine set microphone input format

Submitted by 若如初见. on 2019-12-07 20:09:32
Question: I want to process the bytes read from the microphone using Swift 3 on iOS. I currently use AVAudioEngine. print(inputNode.inputFormat(forBus: bus).settings) print(inputNode.inputFormat(forBus: bus).formatDescription) This gives me the following output: ["AVNumberOfChannelsKey": 1, "AVLinearPCMBitDepthKey": 32, "AVSampleRateKey": 16000, "AVLinearPCMIsNonInterleaved": 1, "AVLinearPCMIsBigEndianKey": 0, "AVFormatIDKey": 1819304813, "AVLinearPCMIsFloatKey": 1] <CMAudioFormatDescription …
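`inputFormat(forBus:)` reports what the hardware delivers; the mic generally cannot be forced into an arbitrary format. The usual pattern is to tap at the hardware format and convert each buffer with AVAudioConverter. A sketch, with the 16 kHz mono target taken from the printed settings above (buffer sizes are assumptions, and modern `inputNode` is non-optional):

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let hwFormat = input.inputFormat(forBus: 0)

// Target format: 16 kHz mono Float32, matching the question's output.
let desired = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                            sampleRate: 16000, channels: 1, interleaved: false)!
let converter = AVAudioConverter(from: hwFormat, to: desired)!

input.installTap(onBus: 0, bufferSize: 4096, format: hwFormat) { buffer, _ in
    // Size the output buffer by the sample-rate ratio.
    let ratio = desired.sampleRate / hwFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
    guard let out = AVAudioPCMBuffer(pcmFormat: desired,
                                     frameCapacity: capacity) else { return }
    var err: NSError?
    converter.convert(to: out, error: &err) { _, status in
        status.pointee = .haveData
        return buffer
    }
    // `out` now holds 16 kHz mono Float32 samples ready for processing.
}
```

Installing the tap with a format that differs from the hardware format is what typically triggers the `kAudioUnitErr_FormatNotSupported` style crashes; converting after the tap avoids that.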

How do I create an AUAudioUnit that implements multiple audio units?

Submitted by 核能气质少年 on 2019-12-07 12:03:49
Question: In Apple's docs for creating an AUAudioUnit (here: https://developer.apple.com/documentation/audiotoolbox/auaudiounit/1387570-initwithcomponentdescription) they claim that "A single audio unit subclass may implement multiple audio units—for example, an effect that can also function as a generator, or a cluster of related effects." There are no examples of this online that I can find. Ideally the answer/solution would use Swift and AVAudioEngine, but I'd happily accept …
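What the docs describe is one AUAudioUnit subclass registered under several component descriptions, branching on the description it was instantiated with. A sketch of the shape (the class name, four-char codes, and `fourCC` helper are all hypothetical):

```swift
import AudioToolbox
import AVFoundation

// Helper to build FourCharCode values from four-character strings.
func fourCC(_ s: String) -> FourCharCode {
    s.utf8.reduce(0) { ($0 << 8) | FourCharCode($1) }
}

// One subclass serving two different component descriptions.
class MultiUnit: AUAudioUnit {
    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription,
                       options: options)
        // Branch on which registered identity we were created as.
        switch componentDescription.componentSubType {
        case fourCC("fx01"):
            break  // configure the effect variant here
        case fourCC("gen1"):
            break  // configure the generator variant here
        default:
            break
        }
    }
}
```

Each variant is then registered separately, e.g. via `AUAudioUnit.registerSubclass(_:as:name:version:)` with a description whose `componentType` is `kAudioUnitType_Effect` for one and `kAudioUnitType_Generator` for the other, both pointing at the same class.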

AVAudioEngine inputNode's format changes when playing an AVAudioPlayerNode

Submitted by 我只是一个虾纸丫 on 2019-12-07 11:17:48
Question: I'll start with a simple "playground" view controller class I've made that demonstrates my problem: class AudioEnginePlaygroundViewController: UIViewController { private var audioEngine: AVAudioEngine! private var micTapped = false override func viewDidLoad() { super.viewDidLoad() configureAudioSession() audioEngine = AVAudioEngine() } @IBAction func toggleMicTap(_ sender: Any) { guard let mic = audioEngine.inputNode else { return } if micTapped { mic.removeTap(onBus: 0) micTapped = false …
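The input node's format can legitimately change when the audio session or route changes, for example when a player node starts under a .playAndRecord category. A defensive sketch: query the format at the moment the tap is installed rather than caching it, and reinstall the tap on configuration changes (`installMicTap` is a hypothetical helper; modern `inputNode` is non-optional):

```swift
import AVFoundation

// Install (or reinstall) the mic tap using the *current* input format.
func installMicTap(on engine: AVAudioEngine) {
    let mic = engine.inputNode
    mic.removeTap(onBus: 0)
    let current = mic.inputFormat(forBus: 0)  // read now, not at viewDidLoad
    mic.installTap(onBus: 0, bufferSize: 2048, format: current) { buffer, _ in
        // process buffer
    }
}

// Reinstall whenever the engine reports a configuration change:
// NotificationCenter.default.addObserver(
//     forName: .AVAudioEngineConfigurationChange,
//     object: engine, queue: .main) { _ in installMicTap(on: engine) }
```

A tap installed with a stale format crashes or goes silent once the hardware format shifts, which matches the symptom in the question.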
