audiounit

Core Audio Swift Equalizer adjusts all bands at once?

冷暖自知 submitted on 2019-12-13 02:09:05
Question: I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ:

var cd: AudioComponentDescription = AudioComponentDescription(
    componentType: OSType(kAudioUnitType_Effect),
    componentSubType: OSType(kAudioUnitSubType_NBandEQ),
    componentManufacturer: OSType(kAudioUnitManufacturer_Apple),
    componentFlags: 0,
    componentFlagsMask: 0)

// Add the node to the graph
status = AUGraphAddNode(graph, &cd, &MyAppNode)
println(status)

// Once the graph has been opened get…
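A common cause of "all bands change at once" with the NBandEQ is addressing the global gain parameter instead of a per-band one: in AudioUnitParameters.h the NBandEQ parameters are banked, and the ID you pass to AudioUnitSetParameter is a base constant plus the band index. A minimal sketch of that arithmetic, with the header's constant values reproduced locally so it compiles anywhere:

```c
#include <assert.h>
#include <stdint.h>

/* Base parameter IDs for kAudioUnitSubType_NBandEQ, as declared in
   AudioUnitParameters.h (copied here so this sketch is self-contained). */
enum {
    kAUNBandEQParam_GlobalGain = 0,    /* affects ALL bands at once */
    kAUNBandEQParam_BypassBand = 1000,
    kAUNBandEQParam_FilterType = 2000,
    kAUNBandEQParam_Frequency  = 3000,
    kAUNBandEQParam_Gain       = 4000,
    kAUNBandEQParam_Bandwidth  = 5000,
};

/* The per-band parameter ID is the base constant plus the band index. */
static uint32_t nbandeq_param(uint32_t base, uint32_t band) {
    return base + band;
}
```

Against a real unit you would then call, e.g., `AudioUnitSetParameter(eqUnit, nbandeq_param(kAUNBandEQParam_Gain, band), kAudioUnitScope_Global, 0, gainDB, 0)` for each band separately.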

What does an Audio Unit Host need to do to make use of non-Apple Audio Units?

百般思念 submitted on 2019-12-12 11:26:22
Question: I am writing an Objective-C++ framework which needs to host Audio Units. Everything works perfectly fine if I attempt to make use of Apple's default units like the DLS Synth and various effects. However, my application seems to be unable to find any third-party Audio Units (in /Library/Audio/Plug-Ins/Components). For example, the following code snippet...

CAComponentDescription tInstrumentDesc = CAComponentDescription('aumu', 'dls ', 'appl');
AUGraphAddNode(mGraph, &tInstrumentDesc, …
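When a host sees Apple's units but no third-party ones, frequent culprits are an architecture mismatch (a 64-bit host cannot load 32-bit-only components) or sandboxing, rather than the search code itself. It still helps to verify exactly which description you are searching for: codes like 'aumu', 'dls ', 'appl' are four-character codes packed big-endian into an OSType. A sketch of that packing, so you can print or compare the raw values:

```c
#include <assert.h>
#include <stdint.h>

/* Pack a 4-character code (e.g. "aumu") into a big-endian 32-bit OSType,
   the representation AudioComponentFindNext matches against. */
static uint32_t fourcc(const char s[4]) {
    return ((uint32_t)(uint8_t)s[0] << 24) |
           ((uint32_t)(uint8_t)s[1] << 16) |
           ((uint32_t)(uint8_t)s[2] << 8)  |
            (uint32_t)(uint8_t)s[3];
}
```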

Setting rate on AudioUnit subtype kAudioUnitSubType_NewTimePitch

不问归期 submitted on 2019-12-12 03:59:49
Question: I'm trying to get/set the rate of an AudioUnit with subtype kAudioUnitSubType_NewTimePitch. The audio unit is added to an AUGraph, through an AUNode, with the following component description:

acd->componentType = kAudioUnitType_Effect;
acd->componentSubType = kAudioUnitSubType_NewTimePitch;
acd->componentManufacturer = kAudioUnitManufacturer_Apple;

According to AudioUnitParameters.h, getting/setting the rate should be as simple as getting/setting the rate parameter on the audio unit.

// rate control…
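For reference, NewTimePitch's rate parameter is a playback-speed multiplier (1.0 = normal speed, documented range roughly 1/32 to 32), while its pitch parameter is expressed in cents. A small sketch of the value handling you would wrap around AudioUnitSetParameter, assuming those ranges:

```c
#include <assert.h>

/* kNewTimePitchParam_Rate is a speed multiplier; Apple documents a range
   of roughly 1/32 to 32. Clamp before setting the parameter. */
static double clamp_rate(double rate) {
    const double lo = 1.0 / 32.0, hi = 32.0;
    if (rate < lo) return lo;
    if (rate > hi) return hi;
    return rate;
}

/* kNewTimePitchParam_Pitch is measured in cents (100 cents = 1 semitone). */
static double semitones_to_cents(double semitones) {
    return semitones * 100.0;
}
```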

Core audio: file playback render callback function

隐身守侯 submitted on 2019-12-12 03:45:48
Question: I am using the RemoteIO Audio Unit for audio playback in my app with kAudioUnitProperty_ScheduledFileIDs. The audio files are in PCM format. How can I implement a render callback function for this case, so I can manually modify the buffer samples? Here is my code:

static AudioComponentInstance audioUnit;
AudioComponentDescription desc;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_RemoteIO;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
desc…
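Whatever wiring is used, the sample-modifying part of a render callback is plain C over the Float32 samples in the AudioBufferList that the callback receives. A minimal sketch of that inner loop, shown on a bare float array so it runs anywhere (the buffer name and gain value are illustrative, not from the question):

```c
#include <assert.h>
#include <stddef.h>

/* The kind of per-sample processing done inside an AURenderCallback after
   the buffers are filled: here, a simple gain applied to one channel's
   Float32 samples (ioData->mBuffers[ch].mData in a real callback). */
static void apply_gain(float *samples, size_t frameCount, float gain) {
    for (size_t i = 0; i < frameCount; ++i)
        samples[i] *= gain;
}
```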

ios Core audio: AudioFilePlayer Unit render callback

岁酱吖の submitted on 2019-12-12 03:38:02
Question: I am trying to create a render callback for my AudioFilePlayer unit. I created two audio units:

static AudioComponentInstance audioUnit; // AudioFilePlayer
static AudioComponentInstance rioUnit;   // RemoteIO unit

Audio unit init code:

AudioComponentDescription filePlayerDesc;
filePlayerDesc.componentType = kAudioUnitType_Generator;
filePlayerDesc.componentSubType = kAudioUnitSubType_AudioFilePlayer;
filePlayerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
filePlayerDesc…
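One detail that trips people up when scheduling playback on an AudioFilePlayer unit: ScheduledAudioFileRegion expresses mStartFrame and mFramesToPlay in sample frames, not seconds, so times must be converted with the file's sample rate. A sketch of that conversion:

```c
#include <assert.h>
#include <stdint.h>

/* ScheduledAudioFileRegion counts in sample frames; convert a time in
   seconds using the audio file's sample rate, rounding to the nearest frame. */
static int64_t seconds_to_frames(double seconds, double sampleRate) {
    return (int64_t)(seconds * sampleRate + 0.5);
}
```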

AUGraphAddNode -10862

两盒软妹~` submitted on 2019-12-12 02:54:28
Question: I want to implement an Audio Unit pattern, I/O pass-through. My code is below:

OSStatus result = noErr;
result = NewAUGraph(&audioGraph);
if (noErr != result) { [self printErrorMessage:@"NewAUGraph" withStatus:result]; return; }

// 2. add AUNodes
AUNode inputNode;
AUNode outputNode;

// client format audio goes into the mixer
clientFromat.SetCanonical(1, true);
clientFromat.mSampleRate = kGraphSampleRate;
clientFromat.Print();

CAComponentDescription input_desc(kAudioUnitType_Output, kAudioUnitSubType…
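The error code in the title, -10862, is kAUGraphErr_OutputNodeErr in AUGraph.h, typically returned when the graph's output-node setup is invalid (for instance, trying to add a second output node instead of a single remote I/O unit serving both input and output). A small lookup table for the AUGraph result codes, with the values as declared in AUGraph.h:

```c
#include <assert.h>
#include <string.h>

/* AUGraph result codes from AUGraph.h. -10862 (kAUGraphErr_OutputNodeErr)
   commonly means the graph's output-unit configuration is invalid,
   e.g. more than one output node was added. */
static const char *augraph_err_name(int code) {
    switch (code) {
        case -10860: return "kAUGraphErr_NodeNotFound";
        case -10861: return "kAUGraphErr_InvalidConnection";
        case -10862: return "kAUGraphErr_OutputNodeErr";
        case -10863: return "kAUGraphErr_CannotDoInCurrentContext";
        case -10864: return "kAUGraphErr_InvalidAudioUnit";
        default:     return "unknown";
    }
}
```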

can I use AVAudioPlayer to make an equalizer player?

天涯浪子 submitted on 2019-12-12 01:29:01
Question: I want to make an equalizer for a music player that can apply EQ settings like bass and treble, and I want to change the sound by adjusting specific frequencies: 250Hz, 1000Hz, 16000Hz.

(void)setEQ:(@"250Hz");
(void)setEQ:(@"1000Hz");
(void)setEQ:(@"16000Hz");

But I cannot find any AVAudioPlayer API to set frequencies. Can anyone help me? I would be very grateful.

Answer 1: I don't think AVAudioPlayer supports equalizer effects; for this you need to apply bands to the audio units, but by…

AVAudioEngine offline render: Silent output only when headphones connected

試著忘記壹切 submitted on 2019-12-11 19:41:37
Question: I've been working on an app that builds an audio pipeline through AVAudioEngine and then renders it to a file. I've been using this code example's approach, adapted for my own needs. The problem is that if headphones are connected to the device, the output audio file is silent. You can observe this by running that project with headphones connected. The only idea I have is that maybe the iPhone usually has a mono outputNode, but headphones give it a stereo format. I find this stuff quite hard to…
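If the asker's hunch is right and the mismatch is a mono source feeding a stereo render format, the fix is to make the channel counts agree (or up-mix explicitly). Duplicating a mono buffer into interleaved stereo is the simplest form of that up-mix; a self-contained sketch:

```c
#include <assert.h>
#include <stddef.h>

/* Duplicate a mono buffer into interleaved stereo (L R L R ...), the kind
   of up-mix needed when the render format is stereo but the source is mono. */
static void mono_to_stereo(const float *mono, float *stereo, size_t frames) {
    for (size_t i = 0; i < frames; ++i) {
        stereo[2 * i]     = mono[i];
        stereo[2 * i + 1] = mono[i];
    }
}
```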

Audio units dynamic registration

血红的双手。 submitted on 2019-12-11 16:49:41
Question: We have developed a custom audio unit and an audio unit hosting application. We are trying to register the custom audio unit dynamically from the application. The code snippet below is used to register the audio unit dynamically (it comes from Apple Technical Note TN2247):

#include <AudioUnit/AudioComponent.h>

extern AudioComponentPlugInInterface* MyExampleAUFactoryFunction(const AudioComponentDescription *inDesc);

OSStatus RegisterMyExampleAudioUnit() {
    // fill out…
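One detail of the TN2247 registration path worth double-checking is the version argument: AudioComponentRegister takes the unit's version as a UInt32 packed as major/minor/bugfix (major in the high 16 bits, then one byte each), so version 1.0.0 is 0x00010000. A sketch of that packing:

```c
#include <assert.h>
#include <stdint.h>

/* AudioComponentRegister's inVersion argument is a packed UInt32:
   major in the high 16 bits, then minor and bugfix bytes,
   so version 1.2.3 becomes 0x00010203. */
static uint32_t au_version(uint16_t major, uint8_t minor, uint8_t bugfix) {
    return ((uint32_t)major << 16) | ((uint32_t)minor << 8) | bugfix;
}
```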

Why can't I change the number of elements / buses in the input scope of AU multi channel mixer?

核能气质少年 submitted on 2019-12-11 14:07:50
Question: UPDATE: I'm changing my code to illustrate the issue in a more streamlined way. I also had a little bug which, while not the cause of the problem, added some confusion. I'm instantiating a Multi Channel Mixer AU on iOS (kAudioUnitSubType_MultiChannelMixer) and I do the following:

OSStatus status = noErr;

// Set component type:
AudioComponentDescription cd = {0};
cd.componentType = kAudioUnitType_Mixer;
cd.componentSubType = kAudioUnitSubType_MultiChannelMixer;
cd.componentManufacturer =…