audiounit

How to record and play audio simultaneously in iOS using Swift?

醉酒当歌 submitted on 2019-12-21 04:00:44
Question: In Objective-C, recording and playing audio simultaneously is fairly simple, and there are tons of sample code on the internet. But I want to record and play audio simultaneously using Audio Unit/Core Audio in Swift. There is very little help or sample code on doing this in Swift, and I couldn't find anything that shows how to achieve it. I am struggling with the code below. let preferredIOBufferDuration = 0.005 let kInputBus = AudioUnitElement(1) let kOutputBus =
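The `preferredIOBufferDuration` in the snippet controls how many frames each render callback delivers. As a framework-free sketch of that arithmetic (the helper name `framesPerBuffer` is illustrative, not an Apple API):

```swift
import Foundation

// Illustrative helper: the raw frame count implied by a preferred IO
// buffer duration. On a real device the session rounds the request to a
// nearby hardware-friendly size (often a power of two such as 256).
func framesPerBuffer(sampleRate: Double, bufferDuration: Double) -> Int {
    return Int((sampleRate * bufferDuration).rounded())
}

let frames = framesPerBuffer(sampleRate: 44_100, bufferDuration: 0.005)
print(frames) // 221 raw frames; hardware typically rounds up to 256
```

So a 5 ms request at 44.1 kHz asks for roughly 221 frames per callback; the actual granted size should be read back from the audio session rather than assumed.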

record sounds played by my iPhone app with audio units

坚强是说给别人听的谎言 submitted on 2019-12-21 02:53:09
Question: I have read a lot of interesting stuff today about iOS & Audio Units and have found a lot of useful resources (SO included). First of all, I am confused about something: is it really necessary to create an audio graph with a mixer unit to record sounds played by an app? Or is it sufficient to play sounds with ObjectAL (or, more simply, AVAudioPlayer calls) and create a single remote IO unit addressed on the correct bus with a recording callback? Second, a more programming-related issue! As I'm

Receiving kAUGraphErr_CannotDoInCurrentContext when calling AUGraphStart for playback

試著忘記壹切 submitted on 2019-12-20 15:37:23
Question: I'm working with the AUGraph and Audio Units API to play back and record audio in my iOS app. I have a rare issue where an AUGraph is unable to start, failing with the following error: result = kAUGraphErr_CannotDoInCurrentContext (-10863) The error occurs unpredictably when we try to call AUGraphStart, which is set up for audio playback: - (BOOL)startRendering { if (playing) { return YES; } playing = YES; if (NO == [self setupAudioForGraph:&au_play_graph playout:YES]) { print_error("Failed to create
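kAUGraphErr_CannotDoInCurrentContext (-10863) signals a transient condition: the operation could not be completed at that moment, and a common workaround is to retry the call after a short delay. A framework-free sketch of such a retry loop (names are illustrative; `start` stands in for a call that returns an OSStatus-style code, 0 meaning noErr):

```swift
import Foundation

// Retry an OSStatus-returning start call a bounded number of times,
// backing off briefly between attempts. Returns 0 on success or the last
// nonzero status once attempts are exhausted.
func startWithRetry(attempts: Int, start: () -> Int32) -> Int32 {
    var status: Int32 = 0
    for _ in 0..<attempts {
        status = start()
        if status == 0 { return 0 }   // noErr: the graph started
        usleep(10_000)                // back off 10 ms before retrying
    }
    return status                     // give up; surface the last error
}
```

In the real app, `start` would wrap `AUGraphStart(au_play_graph)`; bounding the attempts keeps a persistent failure from spinning forever.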

AudioUnit tone generator is giving me a chirp at the end of each tone generated

人走茶凉 submitted on 2019-12-20 15:32:08
Question: I'm creating an old-school music emulator for the old GWBasic PLAY command. To that end I have a tone generator and a music player. Between each of the notes played I'm getting a chirp sound that's mucking things up. Below are both of my classes: ToneGen.h #import <Foundation/Foundation.h> @interface ToneGen : NSObject @property (nonatomic) id delegate; @property (nonatomic) double frequency; @property (nonatomic) double sampleRate; @property (nonatomic) double theta; - (void)play:(float)ms; -
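A chirp or click between notes is usually a waveform discontinuity: the tone is cut off at a non-zero sample value, which the speaker renders as a transient. A standard fix is a short amplitude ramp (fade-out) at the end of each note, so the buffer ends at silence. A framework-free sketch (function and parameter names are illustrative):

```swift
import Foundation

// Generate one note's samples, ramping the gain linearly to zero over the
// last `fadeFrames` samples so the note ends at (near) silence instead of
// cutting off mid-cycle.
func toneSamples(frequency: Double, sampleRate: Double,
                 frames: Int, fadeFrames: Int) -> [Double] {
    var theta = 0.0
    let delta = 2.0 * Double.pi * frequency / sampleRate
    return (0..<frames).map { i in
        var amp = 1.0
        let remaining = frames - i
        if remaining <= fadeFrames {            // fade-out window
            amp = Double(remaining) / Double(fadeFrames)
        }
        defer { theta += delta }                // keep phase continuous
        return amp * sin(theta)
    }
}

// 100 ms of A440 at 44.1 kHz, with a 10 ms fade-out.
let note = toneSamples(frequency: 440, sampleRate: 44_100,
                       frames: 4_410, fadeFrames: 441)
```

A matching fade-in on the next note, or carrying `theta` across notes instead of resetting it, removes the click at note starts for the same reason.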

How do you set the input level (gain) on the built-in input (OSX Core Audio / Audio Unit)?

我的未来我决定 submitted on 2019-12-20 14:17:51
Question: I've got an OSX app that records audio data using an Audio Unit. The Audio Unit's input can be set to any available source with inputs, including the built-in input. The problem is, the audio I get from the built-in input is often clipped, whereas in a program such as Audacity (or even QuickTime) I can turn down the input level and avoid the clipping. Multiplying the sample frames by a fraction, of course, doesn't work: I get a lower volume, but the samples themselves are still
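The question's own observation is the key: once the converter clips, the flat tops are baked into the captured data, and scaling afterwards only yields a quieter clipped wave. The gain has to be reduced before conversion, at the device level (on macOS, hardware input volume is exposed through Core Audio device properties such as kAudioDevicePropertyVolumeScalar). A framework-free sketch of why post-hoc scaling fails:

```swift
import Foundation

// Hard-clip a sample to the [-1, 1] range, as an overdriven input stage does.
func clip(_ x: Double) -> Double { max(-1.0, min(1.0, x)) }

// A sine that is 2x too hot for the converter...
let hot = (0..<100).map { sin(2.0 * Double.pi * Double($0) / 100.0) * 2.0 }
// ...gets its peaks flattened at +/-1.0 during capture...
let clipped = hot.map(clip)
// ...and scaling afterwards just gives a quieter wave with flat tops at 0.5.
let scaled = clipped.map { $0 * 0.5 }
```

The flat regions in `scaled` are unchanged in shape; no multiplication can recover the waveform that was lost at the converter.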

Using ARC for the Cocoa UI of an AudioUnit prevents NSView dealloc from being called

青春壹個敷衍的年華 submitted on 2019-12-13 20:05:12
Question: I recently converted my AudioUnit plugin to take advantage of ARC for all the interface code (Cocoa). However, this resulted in the main NSView (the one created by the CocoaViewFactory and returned to the plugin as a property) never having -dealloc called. This made it impossible to dispose of the AUEventListener constructed for the NSView, which is documented to cause crashes, and creates a ton of memory leaks (none of the Objective-C objects retained by the NSView are deallocated,
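Under ARC, dealloc (deinit in Swift) never fires while anything still holds a strong reference to the view, and a listener or block that captured the view strongly is a classic way to create such a cycle. Breaking the cycle with a weak capture lets dealloc run, which is where AUEventListener-style resources can be torn down. A framework-free sketch with illustrative class names:

```swift
// Listener stands in for any callback holder (an event listener, a timer,
// a notification block) that would otherwise retain the view strongly.
final class Listener {
    var onEvent: (() -> Void)?
}

final class View {
    let listener = Listener()
    var onDeinit: (() -> Void)?
    init() {
        // [weak self] breaks the View -> Listener -> closure -> View cycle.
        listener.onEvent = { [weak self] in _ = self }
    }
    deinit { onDeinit?() }  // dispose listener-style resources here
}

var deallocated = false
var view: View? = View()
view?.onDeinit = { deallocated = true }
view = nil  // with the weak capture, deinit runs and the flag flips
```

Had the closure captured `self` strongly, dropping `view` would leak the whole object graph and `deinit` would never run, mirroring the plugin's symptom.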

How to play pcm audio buffer from a socket server using audio unit circular buffer

旧城冷巷雨未停 submitted on 2019-12-13 18:23:40
Question: I hope someone can help me. I am new to Objective-C and OSX, and I am trying to play audio data I am receiving via a socket into my audio queue. I found this link https://stackoverflow.com/a/30318859/4274654 which in a way addresses my issue with a circular buffer. However, when I try to run my project, it returns an error (OSStatus) -10865, which is why the code logs "Error enabling AudioUnit output bus". status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO,
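The circular-buffer pattern from the linked answer decouples the two threads: the socket thread writes PCM samples in, and the render callback reads them out. As a framework-free sketch of the indexing (standing in for a production buffer such as TPCircularBuffer; this version is not lock-free and only illustrates the mechanics):

```swift
// Minimal single-producer/single-consumer ring buffer sketch.
struct RingBuffer {
    private var storage: [Float]
    private var readIndex = 0
    private var writeIndex = 0
    private(set) var count = 0

    init(capacity: Int) { storage = [Float](repeating: 0, count: capacity) }

    // Queue samples; returns how many were accepted (drops on overflow).
    mutating func write(_ samples: [Float]) -> Int {
        var written = 0
        for s in samples where count < storage.count {
            storage[writeIndex] = s
            writeIndex = (writeIndex + 1) % storage.count
            count += 1
            written += 1
        }
        return written
    }

    // Dequeue up to n samples; may return fewer on underrun, in which case
    // a render callback would fill the remainder with silence.
    mutating func read(_ n: Int) -> [Float] {
        var out: [Float] = []
        while out.count < n && count > 0 {
            out.append(storage[readIndex])
            readIndex = (readIndex + 1) % storage.count
            count -= 1
        }
        return out
    }
}
```

A real render callback must never block, which is why the production versions use atomic head/tail indices instead of a lock; the modular index arithmetic, however, is the same as here.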

initialize audiounit with kAudioFormatiLBC

£可爱£侵袭症+ submitted on 2019-12-13 05:24:15
Question: I'm trying to initialize an AudioUnit to record audio using iLBC. Unfortunately I need to use iLBC as the codec and cannot choose a different one. After reading the documentation and forums I found that the correct stream descriptor for iLBC should be something like: streamDesc.mSampleRate = 8000.0; streamDesc.mFormatID = kAudioFormatiLBC; streamDesc.mChannelsPerFrame = 1; Then I use: AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &size, &streamDesc); to fill the empty