audiounit

iOS: How to resample audio (PCM data) using Audio Unit at runtime?

五迷三道 submitted on 2020-07-02 07:45:08
Question: How can I resample audio (PCM data) using an Audio Unit at runtime/live? I have an Audio Unit set up as follows.

- (void)setUpAudioUnit {
    OSStatus status;
    AudioComponentInstance audioUnit;
    AudioComponent inputComponent;
    AudioComponentDescription audioComponentDescription;
    AudioStreamBasicDescription audioStreamBasicDescription;

    // Describe audio component
    audioComponentDescription.componentType = kAudioUnitType_Output;
    audioComponentDescription.componentSubType = kAudioUnitSubType
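The excerpt cuts off before the rest of the setup, but one way to resample PCM buffers at runtime, independent of the Audio Unit callback itself, is AVAudioConverter. A minimal sketch follows; the helper name and the one-shot feed-and-flush pattern are illustrative, not taken from the question's code.

import AVFoundation

// Sketch only: resample one PCM buffer to a new format/sample rate.
func resample(_ input: AVAudioPCMBuffer, to outputFormat: AVAudioFormat) -> AVAudioPCMBuffer? {
    guard let converter = AVAudioConverter(from: input.format, to: outputFormat) else { return nil }
    let ratio = outputFormat.sampleRate / input.format.sampleRate
    let capacity = AVAudioFrameCount(Double(input.frameLength) * ratio) + 32   // a little headroom
    guard let output = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: capacity) else { return nil }

    var fed = false
    let status = converter.convert(to: output, error: nil) { _, outStatus in
        if fed {
            outStatus.pointee = .endOfStream   // single buffer already delivered, flush
            return nil
        }
        fed = true
        outStatus.pointee = .haveData
        return input
    }
    return status == .error ? nil : output
}

For a live stream you would keep one converter alive and feed it successive buffers from the input callback instead of flushing after every call.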

Can an audio unit (v3) replace inter-app audio to send audio to a host app?

浪子不回头ぞ submitted on 2020-06-17 07:36:50
Question: My music performance app plays audio with AVAudioEngine, and uses inter-app audio to publish the engine's output to other apps. This allows users to feed the audio into a mixer app running on the same device. Since IAA is deprecated on iOS and not supported on Mac, I'm trying to replace this functionality with Audio Units. I've added an audio unit extension of type augn using the Xcode template, and I understand the internalRenderBlock is what actually returns the audio data. But how can the
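The question is cut off, but as context for internalRenderBlock: in an AUv3 extension it is the block the host calls to pull audio. Below is a bare-bones sketch of a generator's render block; a real extension also has to publish its output busses and allocate render resources, and the class name and sine-wave content are illustrative only (the standard deinterleaved Float32 output format is assumed).

import AudioToolbox
import AVFoundation

// Minimal sketch of an AUv3 generator (componentType "augn") render block.
class ToneGeneratorAudioUnit: AUAudioUnit {
    private var phase: Double = 0

    override var internalRenderBlock: AUInternalRenderBlock {
        let sampleRate = 44_100.0
        return { [unowned self] _, _, frameCount, _, outputData, _, _ in
            let buffers = UnsafeMutableAudioBufferListPointer(outputData)
            // Fill every output channel with a 440 Hz sine (illustrative only),
            // assuming the host supplied output buffers.
            for frame in 0..<Int(frameCount) {
                let sample = Float(sin(self.phase))
                self.phase += 2.0 * .pi * 440.0 / sampleRate
                for buffer in buffers {
                    guard let data = buffer.mData else { continue }
                    data.assumingMemoryBound(to: Float.self)[frame] = sample
                }
            }
            return noErr
        }
    }
}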

Connection of varispeed with RemoteIO in iOS

邮差的信 submitted on 2020-05-14 03:55:06
Question: I am working with Audio Units to play audio and change the playback speed, since AUGraph is deprecated. I have successfully played audio coming in over UDP via Audio Units and made connections like:

converterUnit -> varispeed -> outConverterUnit -> RemoteIO (out)

Our playback format is Int16 (PCM), but Varispeed requires float data, so we are using converters around the varispeed unit. Here is my code:

var ioFormat = CAStreamBasicDescription(
    sampleRate: 48000.0,
    numChannels: 1,
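For reference, the chain described above can be wired without AUGraph using kAudioUnitProperty_MakeConnection. A sketch, assuming converterIn, varispeed, converterOut, and remoteIO are already-instantiated AudioUnit instances (those names are illustrative, not from the question):

import AudioToolbox

// Connect one unit's output 0 to another unit's input bus.
func connect(_ source: AudioUnit, to destination: AudioUnit, destBus: UInt32) -> OSStatus {
    var connection = AudioUnitConnection(sourceAudioUnit: source,
                                         sourceOutputNumber: 0,
                                         destInputNumber: destBus)
    return AudioUnitSetProperty(destination,
                                kAudioUnitProperty_MakeConnection,
                                kAudioUnitScope_Input,
                                destBus,
                                &connection,
                                UInt32(MemoryLayout<AudioUnitConnection>.size))
}

// Usage, assuming the four units already exist:
//   _ = connect(converterIn,  to: varispeed,    destBus: 0)
//   _ = connect(varispeed,    to: converterOut, destBus: 0)
//   _ = connect(converterOut, to: remoteIO,     destBus: 0)

Each connection also requires the stream formats on both sides of the link to agree, which is where the Int16-to-Float converter units come in.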

AudioUnit inputCallback with AudioUnitRender -> mismatch between audioBufferList.mBuffers[0].mDataByteSize != inNumberFrames

一个人想着一个人 submitted on 2020-03-05 04:11:25
Question: We are using the AudioUnit's input callback to process the incoming buffer. The audio unit setup is taken mostly from https://github.com/robovm/apple-ios-samples/blob/master/aurioTouch/Classes/AudioController.mm. I have added some sanity checks in the audio callback. It looks like this:

/// The audio input callback
static OSStatus audioInputCallback(void __unused *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 __unused inBusNumber,
                                   UInt32
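The callback is cut off at its parameter list, but the shape of the described check can be sketched as follows. The 2-bytes-per-frame (16-bit mono) assumption and the way inRefCon is interpreted are illustrative, not taken from the linked AudioController.mm.

import AudioToolbox

// Sketch of an input callback that compares the rendered byte count against
// inNumberFrames before using the data.
let inputCallback: AURenderCallback = { inRefCon, ioActionFlags, inTimeStamp,
                                        inBusNumber, inNumberFrames, _ in
    let bytesPerFrame: UInt32 = 2                    // assumed 16-bit mono
    let buffer = AudioBuffer(mNumberChannels: 1,
                             mDataByteSize: inNumberFrames * bytesPerFrame,
                             mData: nil)             // nil asks the unit to supply its own buffer
    var bufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: buffer)

    // Assumes inRefCon carries a pointer to the remote I/O unit, set when the
    // callback was registered with kAudioOutputUnitProperty_SetInputCallback.
    let audioUnit = inRefCon.assumingMemoryBound(to: AudioUnit.self).pointee

    let status = AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp,
                                 inBusNumber, inNumberFrames, &bufferList)
    guard status == noErr else { return status }

    if bufferList.mBuffers.mDataByteSize != inNumberFrames * bytesPerFrame {
        // Mismatch like the one in the title: the stream format on the input
        // bus probably differs from the bytes-per-frame assumed here.
    }
    return noErr
}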

Control mono playback output with Core Audio

自闭症网瘾萝莉.ら submitted on 2020-03-03 06:18:12
Question: I'm developing an application for iOS that uses the RemoteIO audio unit to record audio from the microphone, process it, and output it to the speakers (headset). Currently I use a single channel (mono) for input and output. What I'd like to do is allow the users to choose an output speaker: left-only, right-only, or both. My current code supports only the "both" setting - the same sound comes from both speakers. Here's how I set the stream format (kAudioUnitProperty_StreamFormat) of the
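The excerpt ends before the stream-format code, but the usual approach for per-side routing is to make the output stereo and write the mono signal only into the chosen channel. A sketch under that assumption (non-interleaved Float32 output; all names are illustrative):

import AudioToolbox

enum OutputChannel { case left, right, both }

// Copy a mono Float32 signal into the selected channel(s) of a
// non-interleaved stereo output AudioBufferList, silencing the other side.
func route(mono: UnsafePointer<Float>,
           frameCount: Int,
           into ioData: UnsafeMutablePointer<AudioBufferList>,
           channel: OutputChannel) {
    let buffers = UnsafeMutableAudioBufferListPointer(ioData)
    guard buffers.count >= 2,
          let left = buffers[0].mData?.assumingMemoryBound(to: Float.self),
          let right = buffers[1].mData?.assumingMemoryBound(to: Float.self) else { return }

    for frame in 0..<frameCount {
        let sample = mono[frame]
        left[frame]  = (channel == .right) ? 0 : sample
        right[frame] = (channel == .left)  ? 0 : sample
    }
}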

Clipping sound with opus on Android, sent from iOS

泪湿孤枕 submitted on 2020-01-23 12:09:43
Question: I am recording audio on iOS from an audio unit, encoding the bytes with Opus, and sending them via UDP to the Android side. The problem is that the sound plays back a bit clipped. I have also tested by sending the raw data from iOS to Android, and it plays perfectly. My audio session code is:

try audioSession.setCategory(.playAndRecord, mode: .voiceChat, options: [.defaultToSpeaker])
try audioSession.setPreferredIOBufferDuration(0.02)
try audioSession.setActive(true)

My recording callback code is:
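The callback itself is missing from the excerpt, but one common cause of this kind of clipping is encoding whatever sample count the callback happens to deliver rather than exact Opus frame sizes (Opus only accepts 2.5/5/10/20/40/60 ms frames, and the hardware buffer rarely matches the preferred 0.02 s exactly). A sketch of accumulating callback samples into fixed frames before encoding; encodeFrame is a placeholder for the real libopus call and is purely illustrative:

import Foundation

final class OpusFrameAccumulator {
    private var pending: [Int16] = []
    private let samplesPerFrame: Int                   // e.g. 960 = 20 ms at 48 kHz mono
    private let encodeFrame: ([Int16]) -> Void

    init(samplesPerFrame: Int, encodeFrame: @escaping ([Int16]) -> Void) {
        self.samplesPerFrame = samplesPerFrame
        self.encodeFrame = encodeFrame
    }

    // Call from the recording callback with whatever sample count arrived.
    func append(_ samples: [Int16]) {
        pending.append(contentsOf: samples)
        // Emit complete frames only; leftovers wait for the next callback.
        while pending.count >= samplesPerFrame {
            encodeFrame(Array(pending.prefix(samplesPerFrame)))
            pending.removeFirst(samplesPerFrame)
        }
    }
}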

Playing Audio on iOS from Socket connection

孤人 submitted on 2020-01-23 03:44:06
Question: Hope you can help me with this issue. I have seen a lot of questions related to this, but none of them really helps me figure out what I am doing wrong here. On Android I have an AudioRecord which is recording audio and sending it as a byte array over a socket connection to clients. That part was super easy on Android and is working perfectly. When I started working with iOS I found out there is no easy way to go about this, so after 2 days of research and plugging and playing this
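The excerpt cuts off before the poster's iOS attempt, but for context, one common way to play raw PCM arriving from a socket on iOS is to wrap each received chunk in an AVAudioPCMBuffer and schedule it on an AVAudioPlayerNode. A sketch, assuming 16 kHz mono Float32 samples; the real format has to match whatever the Android AudioRecord actually sends (which is often 16-bit PCM and would need converting first):

import AVFoundation

final class StreamPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 16_000,
                                       channels: 1,
                                       interleaved: false)!

    func start() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    // Call for every chunk of Float32 samples read from the socket.
    func enqueue(_ samples: [Float]) {
        let frameCount = AVAudioFrameCount(samples.count)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount
        for (i, s) in samples.enumerated() {
            buffer.floatChannelData![0][i] = s
        }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}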

What's the reason of using Circular Buffer in iOS Audio Calling APP?

混江龙づ霸主 submitted on 2020-01-11 03:30:06
Question: My question is pretty much self-explanatory; sorry if it seems too dumb. I am writing an iOS VoIP dialer and have checked some open-source code (iOS audio calling apps), and almost all of them use a circular buffer for storing recorded and received PCM audio data. So I am wondering why we need to use a circular buffer in this case. What's the exact reason for using such an audio buffer? Thanks in advance. Answer 1: Good question. There is another good reason for using a circular buffer. In iOS, if you use
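The answer is cut off, but the standard motivation is decoupling: the real-time audio callback produces and consumes fixed-size chunks on a strict deadline, while the network and codec side works at its own pace and chunk size, so a ring buffer sits between the two threads. A simplified sketch of the idea; real projects usually reach for TPCircularBuffer, which is lock-free and therefore safer to touch from the audio callback than the lock used here for brevity:

import Foundation

// Single-producer / single-consumer ring buffer for Int16 PCM samples.
final class RingBuffer {
    private var storage: [Int16]
    private var head = 0   // next write index
    private var tail = 0   // next read index
    private var count = 0
    private let lock = NSLock()

    init(capacity: Int) { storage = [Int16](repeating: 0, count: capacity) }

    /// Write samples; returns how many actually fit.
    func write(_ samples: [Int16]) -> Int {
        lock.lock(); defer { lock.unlock() }
        var written = 0
        for s in samples where count < storage.count {
            storage[head] = s
            head = (head + 1) % storage.count
            count += 1
            written += 1
        }
        return written
    }

    /// Read up to `maxCount` samples (fewer if the buffer runs dry).
    func read(maxCount: Int) -> [Int16] {
        lock.lock(); defer { lock.unlock() }
        var out: [Int16] = []
        while out.count < maxCount && count > 0 {
            out.append(storage[tail])
            tail = (tail + 1) % storage.count
            count -= 1
        }
        return out
    }
}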

memory is growing in audio buffer code

徘徊边缘 submitted on 2020-01-06 05:44:09
Question: I have code that we use many times in our apps; it's a class that takes the buffer samples and processes them, then sends a notification back to the main class. The code is C and Objective-C. It works just great, but there is memory growth that I can see in the Instruments Allocations tool: the "overall bytes" keeps growing by about 100 KB per second, because of some parts of the code that I can identify. This is the callback function, with the line that causes the problem. It happens many times a
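The excerpt cuts off before the offending line, but the usual culprit in render callbacks is allocating fresh storage (NSData, arrays, malloc) on every call. A hedged sketch of the common fix, allocating a scratch buffer once and reusing it; sizes and names are illustrative:

import AudioToolbox

final class CallbackScratch {
    // Big enough for the largest inNumberFrames expected, e.g. 4096 frames
    // of 16-bit mono audio.
    let capacityBytes: Int
    let bytes: UnsafeMutableRawPointer

    init(maxFrames: Int, bytesPerFrame: Int) {
        capacityBytes = maxFrames * bytesPerFrame
        bytes = UnsafeMutableRawPointer.allocate(byteCount: capacityBytes,
                                                 alignment: MemoryLayout<Int16>.alignment)
    }

    deinit { bytes.deallocate() }

    // Inside the callback, point the AudioBufferList at the reused memory
    // instead of allocating new storage each time.
    func makeBufferList(frames: UInt32, bytesPerFrame: UInt32) -> AudioBufferList {
        let buffer = AudioBuffer(mNumberChannels: 1,
                                 mDataByteSize: frames * bytesPerFrame,
                                 mData: bytes)
        return AudioBufferList(mNumberBuffers: 1, mBuffers: buffer)
    }
}

In Objective-C callbacks, wrapping the per-call work in an @autoreleasepool also keeps transient objects from piling up between pool drains.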

AudioUnitRender error -50 with odd length buffers

ⅰ亾dé卋堺 submitted on 2020-01-05 03:29:08
Question: I have a RemoteIO unit configured with AVAudioSessionCategoryPlayAndRecord, and I see some strange behavior in it. I open the app and immediately close it before the audio unit initializes fully (it actually finishes initializing in the background because I quit the app too soon). Then I bring the app back to the foreground, and immediately on relaunch I see AudioUnitRender failing with error -50 continuously. I find inNumberFrames to be 1115, and it fails whenever this number is odd.

func recordingCallback(inRefCon
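The callback is cut off, but error -50 (kAudio_ParamError) from AudioUnitRender generally means an inconsistent argument, most often an mDataByteSize that no longer matches inNumberFrames * mBytesPerFrame once the frame count changes. A sketch of always deriving the buffer size from the live frame count and the bus's actual stream format; the function and parameter names are illustrative:

import AudioToolbox

// Pull one buffer of input from the RemoteIO unit's input bus (bus 1),
// sizing the AudioBufferList from the frame count delivered to the callback.
func renderInput(from audioUnit: AudioUnit,
                 ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
                 inTimeStamp: UnsafePointer<AudioTimeStamp>,
                 inNumberFrames: UInt32,
                 format: AudioStreamBasicDescription) -> (OSStatus, AudioBufferList) {
    let buffer = AudioBuffer(mNumberChannels: format.mChannelsPerFrame,
                             mDataByteSize: inNumberFrames * format.mBytesPerFrame,
                             mData: nil)   // nil: let the audio unit provide memory
    var bufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: buffer)
    let status = AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp,
                                 1,        // RemoteIO input bus
                                 inNumberFrames, &bufferList)
    return (status, bufferList)
}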