audiokit

Background mode is not enabled in iOS tests

余生颓废 submitted on 2020-01-14 15:59:13

Question: I'm writing a Swift library which in turn uses the AudioKit library as a dependency. When using AudioKit on iOS, you have to enable 'Background Mode' in the capabilities section of the project settings. But when running iOS tests, there is no place for such configuration, and as a result you're confronted with this error: CheckError Error: kMIDINotPermitted: Have you enabled the audio background mode in your iOS app? I tried adding the corresponding "Required background modes" entry to the
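One way to see the problem from inside a test: with a hosted test target, Bundle.main resolves to the host app's bundle, so you can check whether the "Required background modes" (UIBackgroundModes) entry is actually visible to the test process. A minimal sketch; a plain library test target has no host app, so this check fails there by design:

    import XCTest

    class BackgroundModeTests: XCTestCase {
        // In a hosted test target, Bundle.main is the host app's bundle.
        // A library test target without a host app carries no such entry,
        // which is why kMIDINotPermitted shows up only under test.
        func testAudioBackgroundModeIsDeclared() {
            let modes = Bundle.main.object(forInfoDictionaryKey: "UIBackgroundModes") as? [String]
            XCTAssertTrue(modes?.contains("audio") ?? false,
                          "host app must declare the 'audio' background mode")
        }
    }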

Audiokit seems to receive only the first three numbers of sysex MIDI messages

喜你入骨 submitted on 2020-01-14 05:00:13

Question: I'm trying to use AudioKit to receive sysex messages from a hardware synthesizer in a Mac app. These synthesizer messages are built up of 11 numbers, for example: 240, 00, 32, 51, 01, 16, 112, 00, 40, 95, 247. In the currently released version of AudioKit (4.5.5), messages like this are received in an AKMIDIListener class in the function receivedMIDISystemCommand(data). The "data" object here receives "messed up" messages like the following: [240, 0, 32, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
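For context, a minimal sketch of the receiving side being described, assuming AudioKit 4.x's AKMIDIListener protocol (a complete sysex message runs from 0xF0/240 to 0xF7/247):

    import AudioKit

    class SysexReceiver: AKMIDIListener {
        let midi = AKMIDI()

        init() {
            midi.openInput()     // open the available MIDI inputs
            midi.addListener(self)
        }

        // System commands arrive here; for the synth above a complete
        // message is eleven bytes, starting with 240 and ending with 247.
        func receivedMIDISystemCommand(_ data: [MIDIByte]) {
            if data.first == 240 {
                print("sysex:", data.map { Int($0) })
            }
        }
    }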

AudioKit crash: required condition is false: !destNodeMixerConns.empty() && !isDestNodeConnectedToIONode

梦想与她 submitted on 2020-01-11 09:06:13

Question: We are experiencing an exception in our project:

    2019-08-08 10:18:28.703708-0600 AppName[99385:5069475] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: !destNodeMixerConns.empty() && !isDestNodeConnectedToIONode'
    *** First throw call stack:
    (
    0 CoreFoundation 0x000000010ee2e8db __exceptionPreprocess + 331
    1 libobjc.A.dylib 0x000000010e21bac5 objc_exception_throw + 48
    2 CoreFoundation 0x000000010ee2e662 +[NSException raise

AudioKit export file with Filters

Deadly submitted on 2020-01-05 05:46:08

Question: I want to overwrite an existing .m4a file, applying several filters with AudioKit. My code:

    file = try AKAudioFile(forReading: recordVoiceURL)
    player = AKPlayer(audioFile: file)
    delay = AKVariableDelay(player)
    delay.rampTime = 0.5
    delayMixer = AKDryWetMixer(player, delay)
    reverb = AKCostelloReverb(delayMixer)
    reverbMixer = AKDryWetMixer(delayMixer, reverb)
    booster = AKBooster(reverbMixer)
    tracker = AKAmplitudeTracker(booster)
    AudioKit.output = tracker
    try AudioKit.start()

I am changing the values of
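To get that chain into a file rather than out of the speaker, AudioKit 4.x's offline bounce is the usual route. A sketch assuming iOS 11+ and the chain above; renderToFile starts the engine itself, so it takes the place of the AudioKit.start() call, and the destination URL here is a placeholder:

    import AudioKit
    import AVFoundation

    // Hypothetical destination URL - write wherever the .m4a should go.
    let renderURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("processed.m4a")
    let outputFile = try AVAudioFile(forWriting: renderURL,
                                     settings: file.fileFormat.settings)

    // Renders faster than real time; kick playback off in prerender.
    try AudioKit.renderToFile(outputFile, duration: file.duration,
                              prerender: {
        player.play()
    })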

AVAudioSession .defaultToSpeaker changes mic input

徘徊边缘 submitted on 2019-12-28 06:33:08

Question: I have an app that taps the microphone and also plays sounds depending on mic input (they don't have to be simultaneous, though). The code below works, but one problem is that the output plays on the small earpiece speaker at the top and not the loud speaker at the bottom. Strangely, I could solve this problem by putting the 3 lines below just before the player starts; then I can hear the sound on the speakers. But then the microphone stops listening! Even after the player stops playing. Basically the mic does not like when it is
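The three lines in question are typically some variant of the AVAudioSession setup below. A sketch of the usual way the route and the mic are configured together, assuming the play-and-record use case described (iOS 10+ API):

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    // .playAndRecord keeps the microphone active while playing;
    // .defaultToSpeaker routes output to the bottom loudspeaker
    // instead of the earpiece receiver.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.defaultToSpeaker])
    try session.setActive(true)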

Get dB(a) level from AudioKit in swift

百般思念 submitted on 2019-12-25 02:47:15

Question: I am attempting to get dB(A) readings from audio recorded from the microphone with AudioKit. I tried to pass the amplitude tracker into the AKFFTTap object, but it always returns an array of zeros when I call AKFFTTap.fftData. Does anyone have experience with getting dB(A) values from AudioKit and the microphone?

Source: https://stackoverflow.com/questions/48813775/get-dba-level-from-audiokit-in-swift
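One point worth separating out: AKAmplitudeTracker reports a linear amplitude, which converts to dBFS via 20·log10(amplitude); true dB(A) additionally requires A-weighting the spectrum and a calibrated reference level. A minimal dBFS sketch, assuming an AudioKit 4.x microphone chain (AKMicrophone's initializer became failable in later 4.x releases):

    import AudioKit
    import Foundation

    let mic = AKMicrophone()
    let tracker = AKAmplitudeTracker(mic)
    AudioKit.output = AKBooster(tracker, gain: 0)  // mute the monitor path
    try AudioKit.start()

    // Linear amplitude (0...1) to decibels relative to full scale.
    // This is dBFS, not dB(A): A-weighting and calibration still apply.
    let amplitude = tracker.amplitude
    let dBFS = 20 * log10(max(amplitude, 1e-7))    // avoid log10(0)
    print("level: \(dBFS) dBFS")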

Receiving Sysex messages with audiokit

橙三吉。 submitted on 2019-12-24 17:07:24

Question: I have an app which sends controller settings to a hardware synthesizer using sysex. In other words, such a sysex message selects a parameter on the synth and sets its value. With AudioKit this is pretty simple. Such a message looks like this: [240, 00, 32, 51, 1, 16, 112, 00, 40, 95, 247], which sets parameter 40 (in parameter group 112) to 95. The bytes 00, 32, 51, 1 identify the synth model, the others the part number and channel. Now I try to build the opposite: the synth sends its parameters and
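For reference, the sending side described as "pretty simple" looks roughly like this in AudioKit 4.x — a sketch assuming AKMIDI's sendMessage, which takes the raw bytes including the 240/247 framing:

    import AudioKit

    let midi = AKMIDI()
    midi.openOutput()    // open the available MIDI outputs

    // Set parameter 40 (group 112) to 95; the bytes 00, 32, 51, 1
    // identify the synth model as described above.
    let sysex: [MIDIByte] = [240, 0, 32, 51, 1, 16, 112, 0, 40, 95, 247]
    midi.sendMessage(sysex)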

Audiokit crashes when enabling sandbox in OS X

倾然丶 夕夏残阳落幕 submitted on 2019-12-24 10:08:07

Question: My app using AudioKit runs perfectly without the sandbox. But as soon as I enable the sandbox in Xcode, the app crashes when trying to initialize mic access. (The app is on the App Store for iOS, but now I am trying to submit it to the Mac App Store as an OS X app, so I need to enable the sandbox.) Has anyone been able to submit a Mac app to the Mac App Store with AudioKit in it?

    ERROR: >avae> AVAudioEngine.mm:275: AttachNode: required condition is false: node != nil

Answer 1: To enable microphone access in
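A sandboxed Mac app needs the com.apple.security.device.audio-input entitlement, an NSMicrophoneUsageDescription string, and (on macOS 10.14+) an explicit permission request before the engine sees an input device. The entitlement and usage string live in the project settings; a sketch of the runtime request:

    import AVFoundation

    // Without granted access, a sandboxed app gets no input device and
    // AudioKit's node setup can fail with the AttachNode error above.
    AVCaptureDevice.requestAccess(for: .audio) { granted in
        guard granted else {
            print("microphone access denied")
            return
        }
        DispatchQueue.main.async {
            // Safe to create the AKMicrophone and start AudioKit here.
        }
    }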

Offline audio render with AudioKit for iOS < 11

。_饼干妹妹 submitted on 2019-12-24 04:03:33

Question: I have 4 AKPlayer nodes, each connected to some effects, and finally they are all mixed together. I want to render the output offline for iOS > 9.0 but I can't figure out how. Edit: I have implemented the render and split it by iOS version. For iOS > 11, renderToFile seems to do well, but for iOS < 11 the rendered file has some lags and jumps forward at some points, ending in silence. Here is my render function:

    do {
        if #available(iOS 11, *) {
            let outputFile = try AKAudioFile
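For orientation, the iOS 11+ branch usually takes the shape below; below iOS 11, AVAudioEngine has no manual rendering mode, which is why AudioKit shipped the separate AKOfflineRenderNode path for older systems. A sketch assuming AudioKit 4.x, with players standing in for the four AKPlayer nodes:

    import AudioKit
    import AVFoundation

    func renderMix(to url: URL, duration: Double, players: [AKPlayer]) throws {
        if #available(iOS 11, *) {
            let outputFile = try AVAudioFile(forWriting: url,
                                             settings: AudioKit.format.settings)
            // Offline, faster-than-real-time bounce of the current graph;
            // start the players inside prerender so they render from 0.
            try AudioKit.renderToFile(outputFile, duration: duration,
                                      prerender: {
                players.forEach { $0.play() }
            })
        } else {
            // Pre-iOS-11: wire an AKOfflineRenderNode before the output
            // and pull samples through it instead (a different graph).
        }
    }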

Creating a MIDI file from an AKKeyboardView

不羁岁月 submitted on 2019-12-23 17:10:55

Question: Currently I am using an AKKeyboardView connected essentially to an AKRhodesPiano object, and I was wondering if there is an easy way to generate a MIDI file from this. I see the AKKeyboardView has the noteOn and noteOff functions, which produce the MIDINoteNumber, but I can't find anything else in the AudioKit library to take this input and generate a MIDI file, even just a simple one.

Answer 1: You would need to run an AKSequencer in the background (maybe with a metronome
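Building on that answer, a sketch of capturing the keyboard's events into an AKSequencer track and writing the result out — assuming AudioKit 4.x, where AKSequencer.genData() returns standard MIDI file bytes. The timing here is deliberately naive: notes are stamped against a fixed 120 BPM wall clock.

    import AudioKit
    import Foundation

    class KeyboardRecorder: AKKeyboardDelegate {
        let sequencer = AKSequencer()
        var track: AKMusicTrack!
        var noteOnTimes = [MIDINoteNumber: Double]()
        let startDate = Date()

        init() {
            track = sequencer.newTrack()
            sequencer.setTempo(120)
        }

        // Seconds since recording began, expressed in beats at 120 BPM.
        private var currentBeat: Double {
            return Date().timeIntervalSince(startDate) * 2
        }

        func noteOn(note: MIDINoteNumber) {
            noteOnTimes[note] = currentBeat
        }

        func noteOff(note: MIDINoteNumber) {
            guard let start = noteOnTimes.removeValue(forKey: note) else { return }
            track.add(noteNumber: note, velocity: 90,
                      position: AKDuration(beats: start),
                      duration: AKDuration(beats: currentBeat - start))
        }

        // genData() produces the bytes of a standard MIDI file.
        func save(to url: URL) throws {
            if let data = sequencer.genData() {
                try data.write(to: url)
            }
        }
    }

Wiring the recorder up is then just a matter of setting it as the AKKeyboardView's delegate alongside the existing AKRhodesPiano handler.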