AVAudioEngine

Save the audio file in the background

懵懂的女人 · submitted on 2019-12-20 02:51:06

Question: I have an app that changes the pitch of the audio when the pitch button is selected. Currently I use `installTapOnBus:` to save the file, but the tap is only installed after the pitch button is pressed, so only part of the audio gets saved. I want to save the whole audio regardless of when the pitch button is selected. Is there any way to do this? This method is used to play the audio:

```objc
- (void)playAudio {
    NSError *err = nil;
    audioEngine = [[AVAudioEngine alloc] init];
    AudioFileplayer = [[AVAudioPlayerNode alloc] …
```
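A common resolution is to install the tap when the engine graph is first wired up, not when the pitch button is pressed, so every buffer is captured from the start. A minimal sketch of that approach (names such as `pitchUnit` and `outputFile` are illustrative, not from the question):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let pitchUnit = AVAudioUnitTimePitch()

engine.attach(player)
engine.attach(pitchUnit)
engine.connect(player, to: pitchUnit, format: nil)
engine.connect(pitchUnit, to: engine.mainMixerNode, format: nil)

let format = engine.mainMixerNode.outputFormat(forBus: 0)
let url = FileManager.default.temporaryDirectory.appendingPathComponent("out.caf")
let outputFile = try AVAudioFile(forWriting: url, settings: format.settings)

// Tap the mixer once, before playback begins; every buffer that passes
// through it (pitched or not) is appended to the file.
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    try? outputFile.write(from: buffer)
}

try engine.start()
// Toggling the pitch button later only changes pitchUnit.pitch;
// the tap keeps writing regardless of when that happens.
```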

AVAudioEngine.connect crash on hardware not simulator

点点圈 · submitted on 2019-12-20 02:27:29

Question:

```swift
var engine: AVAudioEngine!
var format = engine.inputNode.inputFormat(forBus: 0)
engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
```

The call to `AVAudioEngine.connect` makes my app crash, but only on hardware; in the simulator it is fine. When I run on hardware, the error says: Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)' terminating with uncaught exception of type …
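On hardware, `inputNode.inputFormat(forBus: 0)` can report 0 Hz and 0 channels if the audio session is not record-capable or microphone permission was denied, which is exactly what `IsFormatSampleRateAndChannelCountValid` rejects. A hedged sketch that activates the session first and guards the format before connecting:

```swift
import AVFoundation

// Sketch: activate a record-capable session before reading the input
// format, and validate it instead of letting connect() crash.
func connectInput(on engine: AVAudioEngine) throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    let format = engine.inputNode.inputFormat(forBus: 0)
    guard format.sampleRate > 0, format.channelCount > 0 else {
        print("Input unavailable: check microphone permission")
        return
    }
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
}
```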

Stream data from network in AVAudioEngine, is it possible?

Deadly · submitted on 2019-12-18 21:28:12

Question: I have an app in which I use AVAudioEngine to play files from the local file system using AVAudioPlayerNodes and AVAudioFiles. This works perfectly fine. Now I would like to extend my setup to also support streaming MP3 files from a server on the internet. What I've tried so far: my hope was that I could create some sort of buffer from NSURL objects pointing to network addresses, which I could then use with my AVAudioPlayerNode. I've searched Stack Overflow and the internet in …
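`AVAudioFile` cannot open an http(s) URL directly, so one simple (if non-progressive) workaround is to download the MP3 to a local file first and then schedule it as usual. A sketch under that assumption; the remote URL is a placeholder:

```swift
import AVFoundation

// Sketch: download-then-play. True progressive streaming needs a lower-level
// approach (Audio File Stream Services) or AVPlayer instead.
func playRemoteMP3(from remote: URL, engine: AVAudioEngine, player: AVAudioPlayerNode) {
    URLSession.shared.downloadTask(with: remote) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }
        // Give the file an .mp3 extension so Core Audio picks the right codec.
        let local = FileManager.default.temporaryDirectory
            .appendingPathComponent("stream.mp3")
        try? FileManager.default.removeItem(at: local)
        try? FileManager.default.moveItem(at: tempURL, to: local)
        do {
            let file = try AVAudioFile(forReading: local)
            player.scheduleFile(file, at: nil)
            player.play()
        } catch {
            print("Decode failed: \(error)")
        }
    }.resume()
}
```

Note that this waits for the entire download before playback starts, which is fine for short clips but not for long streams.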

Spectrogram from AVAudioPCMBuffer using Accelerate framework in Swift

霸气de小男生 · submitted on 2019-12-18 11:33:00

Question: I'm trying to generate a spectrogram from an AVAudioPCMBuffer in Swift. I install a tap on an AVAudioMixerNode and receive a callback with the audio buffer. I'd like to convert the signal in the buffer to a `[Float: Float]` dictionary, where the key represents the frequency and the value represents the magnitude of the audio at that frequency. I tried using Apple's Accelerate framework, but the results I get seem dubious. I'm sure it's just in the way I'm converting the signal. I …
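A frequent source of "dubious" results is skipping the real-to-split-complex packing that `vDSP_fft_zrip` expects. A sketch of the conversion for the first channel (it assumes `frameLength` is a power of two and omits windowing and scaling, which a real spectrogram would need):

```swift
import Accelerate
import AVFoundation

// Sketch: map a PCM buffer to [frequency: magnitude] via vDSP's real FFT.
func magnitudes(from buffer: AVAudioPCMBuffer) -> [Float: Float] {
    guard let samples = buffer.floatChannelData?[0] else { return [:] }
    let n = Int(buffer.frameLength)                    // assumed power of two
    let log2n = vDSP_Length(log2(Float(n)))
    guard let fft = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return [:] }
    defer { vDSP_destroy_fftsetup(fft) }

    var real = [Float](repeating: 0, count: n / 2)
    var imag = [Float](repeating: 0, count: n / 2)
    var result: [Float: Float] = [:]
    real.withUnsafeMutableBufferPointer { rp in
        imag.withUnsafeMutableBufferPointer { ip in
            var split = DSPSplitComplex(realp: rp.baseAddress!, imagp: ip.baseAddress!)
            // Pack the interleaved real samples into split-complex form.
            samples.withMemoryRebound(to: DSPComplex.self, capacity: n / 2) {
                vDSP_ctoz($0, 2, &split, 1, vDSP_Length(n / 2))
            }
            vDSP_fft_zrip(fft, &split, 1, log2n, FFTDirection(FFT_FORWARD))
            var mags = [Float](repeating: 0, count: n / 2)
            vDSP_zvabs(&split, 1, &mags, 1, vDSP_Length(n / 2))
            let binWidth = Float(buffer.format.sampleRate) / Float(n)
            for bin in 0..<(n / 2) {
                result[Float(bin) * binWidth] = mags[bin]
            }
        }
    }
    return result
}
```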

Trying to stream audio from microphone to another phone via multipeer connectivity

故事扮演 · submitted on 2019-12-18 04:51:09

Question: I am trying to stream audio from the microphone to another iPhone via Apple's Multipeer Connectivity framework. For audio capturing and playback I am using AVAudioEngine (many thanks to Rhythmic Fistman's answer here). I receive data from the microphone by installing a tap on the input node; from this I get an AVAudioPCMBuffer, which I then convert to an array of UInt8 and stream to the other phone. But when I convert the array back to an AVAudioPCMBuffer I get an EXC_BAD …
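A frequent cause of EXC_BAD_ACCESS in this round trip is rebuilding the buffer without setting `frameLength` (it defaults to 0) or copying with a mismatched byte count. A sketch of a symmetric Float32 PCM encode/decode pair; both sides must agree on the format:

```swift
import AVFoundation

// Sketch: serialize each channel's Float samples, then rebuild the buffer
// with an explicit, matching frameLength.
func data(from buffer: AVAudioPCMBuffer) -> Data {
    let channels = Int(buffer.format.channelCount)
    let byteCount = Int(buffer.frameLength) * MemoryLayout<Float>.size
    var out = Data()
    for ch in 0..<channels {
        out.append(Data(bytes: buffer.floatChannelData![ch], count: byteCount))
    }
    return out
}

func buffer(from data: Data, format: AVAudioFormat) -> AVAudioPCMBuffer? {
    let channels = Int(format.channelCount)
    let frames = data.count / (channels * MemoryLayout<Float>.size)
    guard let buf = AVAudioPCMBuffer(pcmFormat: format,
                                     frameCapacity: AVAudioFrameCount(frames)) else { return nil }
    buf.frameLength = AVAudioFrameCount(frames)  // crucial: defaults to 0
    let byteCount = frames * MemoryLayout<Float>.size
    data.withUnsafeBytes { raw in
        for ch in 0..<channels {
            let src = raw.baseAddress!.advanced(by: ch * byteCount)
            memcpy(buf.floatChannelData![ch], src, byteCount)
        }
    }
    return buf
}
```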

AVAudioEngine inputNode installTap crash when restarting recording

空扰寡人 · submitted on 2019-12-18 04:40:07

Question: I am implementing speech recognition in my app. The first time I present the view controller containing the speech-recognition logic, everything works fine. However, when I present the view controller again, I get the following crash: ERROR: [0x190bf000] >avae> AVAudioNode.mm:568: CreateRecordingTap: required condition is false: IsFormatSampleRateAndChannelCountValid(format) *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: …
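Installing a second tap on the same input bus without removing the first is a classic cause of this crash when a view controller is presented again. A sketch of a setup that tears down before setting up (class and method names are illustrative):

```swift
import AVFoundation

// Sketch: always removeTap before installTap, and stop the engine when
// the recognition screen goes away.
final class SpeechCapture {
    private let engine = AVAudioEngine()

    func start(handler: @escaping (AVAudioPCMBuffer) -> Void) throws {
        let input = engine.inputNode
        input.removeTap(onBus: 0)  // safe even if no tap is installed
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            handler(buffer)
        }
        engine.prepare()
        try engine.start()
    }

    func stop() {
        engine.inputNode.removeTap(onBus: 0)
        engine.stop()
    }
}
```

Calling `stop()` in `viewWillDisappear` and `start(handler:)` in `viewDidAppear` keeps the engine in a consistent state across presentations.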

AVAudioPlayerNode playing unexpectedly multiple times

£可爱£侵袭症+ · submitted on 2019-12-13 03:51:12

Question: On top of a background audio file, I want to play a sequence of 8 audio files. By touching a segmented control, the user can choose which sounds are played on top of the background file: `playCountOneAndFive(index: index)`, `playCountOneToSeven(index: index)`, `playCountOneToEight(index: index)`. The problem I am facing is that when I try to switch from one function to another, some of the nodes end up playing simultaneously several times. For example, instead of playing …
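`AVAudioPlayerNode` queues everything scheduled on it, so switching sequences without stopping first leaves the old segments queued alongside the new ones. A sketch of a switch routine that flushes each node's pending schedule before building the next sequence (names are illustrative):

```swift
import AVFoundation

// Sketch: stop() both halts the node and discards its previously
// scheduled files/buffers, so only the new sequence remains queued.
func switchSequence(players: [AVAudioPlayerNode], files: [AVAudioFile]) {
    for player in players {
        player.stop()
    }
    for (player, file) in zip(players, files) {
        player.scheduleFile(file, at: nil)
        player.play()
    }
}
```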

completionHandler of AVAudioPlayerNode.scheduleFile() is called too early

痞子三分冷 · submitted on 2019-12-12 08:06:12

Question: I am trying to use the new AVAudioEngine in iOS 8. It looks like the completionHandler of `player.scheduleFile()` is called before the sound file has finished playing. I am using a sound file with a length of 5 s, and the `println()` message appears roughly 1 second before the end of the sound. Am I doing something wrong, or do I misunderstand the idea of a completionHandler? Thanks! Here is some code: `class SoundHandler { let engine: AVAudioEngine; let player: AVAudioPlayerNode; let mainMixer` …
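This is expected behavior: the plain `completionHandler` fires when the engine has finished *reading* the file, which can be ahead of what the speaker has actually played. On iOS 11 and later, `scheduleFile(_:at:completionCallbackType:completionHandler:)` can tie the callback to actual playback instead. A sketch:

```swift
import AVFoundation

// Sketch: prefer the .dataPlayedBack callback type where available;
// the legacy handler may fire early by design.
func play(_ file: AVAudioFile, on player: AVAudioPlayerNode) {
    if #available(iOS 11.0, *) {
        player.scheduleFile(file, at: nil,
                            completionCallbackType: .dataPlayedBack) { _ in
            print("finished playing (heard at the output)")
        }
    } else {
        player.scheduleFile(file, at: nil) {
            print("finished scheduling (may be early)")
        }
    }
    player.play()
}
```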

AVAudioEngine offline render: Silent output only when headphones connected

試著忘記壹切 · submitted on 2019-12-11 19:41:37

Question: I've been working on an app that builds an audio pipeline with AVAudioEngine and then renders it to a file. I've been using this code example's approach, adapted for my own needs. The problem is that if headphones are connected to the device, the output audio file is silent. You can observe this by running that project with headphones connected. The only idea I have is that maybe the iPhone usually has a mono outputNode, but headphones give it a stereo format. I find this stuff quite hard to …
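One way to sidestep route-dependent behavior is to pin the offline render to an explicit format instead of whatever the current hardware route (headphones vs. speaker) reports. A sketch using manual rendering mode (iOS 11+); the 44.1 kHz stereo format and frame counts are assumptions, not from the question:

```swift
import AVFoundation

// Sketch: offline render with a fixed, route-independent format.
func renderOffline(engine: AVAudioEngine, seconds: Double, to url: URL) throws {
    let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: 4096)
    try engine.start()

    let file = try AVAudioFile(forWriting: url, settings: format.settings)
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4096)!
    let total = AVAudioFramePosition(seconds * format.sampleRate)

    while engine.manualRenderingSampleTime < total {
        let frames = min(4096, AVAudioFrameCount(total - engine.manualRenderingSampleTime))
        if try engine.renderOffline(frames, to: buffer) == .success {
            try file.write(from: buffer)
        }
    }
    engine.stop()
}
```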

AVAudioEngine Microphone Crash on Start

女生的网名这么多〃 · submitted on 2019-12-11 15:08:14

Question: I'm trying to set up an audio queue to stream audio from the microphone on an iPhone. I create my audio engine:

```swift
var audioEngine = AVAudioEngine()
```

And my audio queue:

```swift
// Serial dispatch queue used to analyze incoming audio buffers.
let analysisQueue = DispatchQueue(label: "com.apple.AnalysisQueue")

// Install an audio tap on the audio engine's input node.
audioEngine.inputNode.installTap(onBus: 0,
                                 bufferSize: 8192, // 8k buffer
                                 format: inputFormat) { buffer, time in
    // Analyze the current audio …
```
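The excerpt references an `inputFormat` that is never defined and never starts the engine, and either gap will stop it from working. A sketch of a minimal complete version, with the analysis dispatched off the real-time audio thread:

```swift
import AVFoundation

// Sketch: derive the format from the input node itself, then start the
// engine; the tap callback only hops to the serial queue.
let audioEngine = AVAudioEngine()
let analysisQueue = DispatchQueue(label: "com.apple.AnalysisQueue")

let inputFormat = audioEngine.inputNode.outputFormat(forBus: 0)
audioEngine.inputNode.installTap(onBus: 0,
                                 bufferSize: 8192,
                                 format: inputFormat) { buffer, time in
    analysisQueue.async {
        // Analyze the buffer here, off the audio render thread.
    }
}

do {
    audioEngine.prepare()
    try audioEngine.start()
} catch {
    print("Engine failed to start: \(error)")
}
```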