audiotoolbox

How to play a PCM audio buffer from a socket server using an Audio Unit circular buffer

Submitted by 旧城冷巷雨未停 on 2019-12-13 18:23:40
Question: I hope someone can help me. I am new to Objective-C and OS X, and I am trying to play audio data that I am receiving over a socket through my audio queue. I found this link, https://stackoverflow.com/a/30318859/4274654, which partly addresses my issue with a circular buffer. However, when I run my project it returns an error (OSStatus) -10865, which is why the code logs "Error enabling AudioUnit output bus". status = AudioUnitSetProperty(_audioUnit, kAudioOutputUnitProperty_EnableIO,
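For context, OSStatus -10865 is kAudioUnitErr_PropertyNotWritable: the Default Output unit on OS X does not let you toggle kAudioOutputUnitProperty_EnableIO, while the HAL output unit (and RemoteIO on iOS) does. Below is a minimal sketch, not the linked answer's exact code, of an output unit whose render callback pulls PCM from a circular buffer filled by the socket thread; MyRenderCallback and its silence-filling body are placeholders for the real ring-buffer read.

#import <AudioToolbox/AudioToolbox.h> // plus <AudioUnit/AudioUnit.h> on older SDKs
#import <CoreAudio/CoreAudio.h>
#include <string.h>

static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData) {
    // Placeholder: copy inNumberFrames of PCM out of the circular buffer here,
    // and output silence when the socket has not delivered enough data yet.
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}

static AudioUnit CreateOutputUnit(void) {
    AudioComponentDescription desc = {0};
    desc.componentType         = kAudioUnitType_Output;
    desc.componentSubType      = kAudioUnitSubType_HALOutput; // kAudioUnitSubType_RemoteIO on iOS
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioUnit unit = NULL;
    AudioComponentInstanceNew(AudioComponentFindNext(NULL, &desc), &unit);

    // Output is element 0; EnableIO is only writable on the HAL/RemoteIO subtypes.
    UInt32 enable = 1;
    OSStatus status = AudioUnitSetProperty(unit, kAudioOutputUnitProperty_EnableIO,
                                           kAudioUnitScope_Output, 0, &enable, sizeof(enable));

    // Point the AUHAL at the default output device.
    AudioDeviceID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);
    AudioObjectPropertyAddress addr = { kAudioHardwarePropertyDefaultOutputDevice,
                                        kAudioObjectPropertyScopeGlobal,
                                        kAudioObjectPropertyElementMaster };
    AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, &device);
    status = AudioUnitSetProperty(unit, kAudioOutputUnitProperty_CurrentDevice,
                                  kAudioUnitScope_Global, 0, &device, sizeof(device));

    // The render callback feeds the output element from the circular buffer.
    AURenderCallbackStruct cb = { MyRenderCallback, NULL /* refCon, e.g. the buffer */ };
    status = AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                                  kAudioUnitScope_Input, 0, &cb, sizeof(cb));

    status = AudioUnitInitialize(unit);
    status = AudioOutputUnitStart(unit);
    (void)status; // check every status in real code
    return unit;
}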

How to read header chunks from a CAF file using Core Audio / AudioToolbox

Submitted by 对着背影说爱祢 on 2019-12-13 04:28:10
Question: I'm trying to read a CAF file on OS X using AudioToolbox's Extended Audio File API. Opening the file works fine; however, I need to access the UUID chunk, and I cannot find any reference on how to do that (or how to access any header chunk of the file). Surely there must be a way to do this without parsing the file on my own. PS: I can already do this with libsndfile, but I want to find a way that uses only components that ship with OS X. I already tried calling ExtAudioFileGetProperty()
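ExtAudioFile itself does not expose raw chunks, but the lower-level Audio File API can return named CAF chunks as "user data". A minimal sketch, assuming the file is opened with AudioFileOpenURL ('uuid' is the CAF UUID chunk type):

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

// Returns the UUID chunk payload of a CAF file, or nil if it is absent.
static NSData *CopyUUIDChunk(NSURL *url) {
    AudioFileID fileID = NULL;
    OSStatus status = AudioFileOpenURL((__bridge CFURLRef)url,
                                       kAudioFileReadPermission, kAudioFileCAFType, &fileID);
    if (status != noErr) return nil;

    UInt32 size = 0;
    NSData *chunk = nil;
    // Index 0 = first chunk of this type in the file.
    status = AudioFileGetUserDataSize(fileID, 'uuid', 0, &size);
    if (status == noErr && size > 0) {
        NSMutableData *data = [NSMutableData dataWithLength:size];
        status = AudioFileGetUserData(fileID, 'uuid', 0, &size, data.mutableBytes);
        if (status == noErr) chunk = data;
    }
    AudioFileClose(fileID);
    return chunk;
}

If the file is already open through the Extended Audio File API, the underlying AudioFileID can be fetched with ExtAudioFileGetProperty and kExtAudioFileProperty_AudioFile instead of opening the file a second time.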

Convert recorded sound on iPhone from one format to another, say WAV to MP3

Submitted by 江枫思渺然 on 2019-12-12 08:15:06
Question: I am trying to record some audio and convert the recordings to other sound formats. I am using the AVAudioRecorder class to record, and these are the recording settings I used: NSDictionary *recordSetting = [[NSMutableDictionary alloc] init]; [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey]; //kAudioFormatMicrosoftGSM, kAudioFormatLinearPCM [recordSetting setValue:[NSNumber numberWithFloat:8000] forKey:AVSampleRateKey]; [recordSetting setValue:[NSNumber
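For reference, a filled-out linear-PCM settings dictionary along the same lines might look like the sketch below. The values (8 kHz, mono, 16-bit) and the output path are illustrative; and since iOS ships no MP3 encoder, the usual route is to record PCM/WAV and transcode to AAC (M4A), or do the MP3 conversion off the device or with a third-party encoder such as LAME.

#import <AVFoundation/AVFoundation.h>

static AVAudioRecorder *StartRecording(void) {
    // Illustrative settings: 8 kHz, mono, 16-bit linear PCM.
    NSDictionary *recordSetting = @{
        AVFormatIDKey             : @(kAudioFormatLinearPCM),
        AVSampleRateKey           : @8000.0f,
        AVNumberOfChannelsKey     : @1,
        AVLinearPCMBitDepthKey    : @16,
        AVLinearPCMIsBigEndianKey : @NO,
        AVLinearPCMIsFloatKey     : @NO
    };

    // Placeholder output path.
    NSURL *outputURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"recording.wav"]];

    NSError *error = nil;
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:outputURL
                                                            settings:recordSetting
                                                               error:&error];
    [recorder prepareToRecord];
    [recorder record];
    return recorder; // keep a strong reference, or recording stops when it deallocates
}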

Recording from a ViewController vs. a Model Class

Submitted by 若如初见. on 2019-12-12 05:38:33
Question: Please refer to this post for the code. How can recording from the view controller (main thread) and recording from a model class be different? I tried calling DispatchQueue.main.async {}, but the audio data is always 44 bytes no matter how long I record, which is not correct. Working implementation: the view controller calls SpeechRecorder.startRecording(). Desired implementation: the view controller calls Model.tryRecording(), which then results in the model calling SpeechRecorder.startRecording(). Any

How do I make an array of SystemSoundIDs using the AudioToolbox framework?

Submitted by 人走茶凉 on 2019-12-12 04:08:11
Question: I'm used to creating sounds like this: NSString *explosionsoundpath = [[NSBundle mainBundle] pathForResource:@"explosion" ofType:@"caf"]; CFURLRef explosionurl = (CFURLRef)[NSURL fileURLWithPath:explosionsoundpath]; AudioServicesCreateSystemSoundID(explosionurl, &explosion1a); AudioServicesCreateSystemSoundID(explosionurl, &explosion1b); where explosion1a and explosion1b are instance variables declared in the .h file with: SystemSoundID explosion1a; Whenever I try to make this process in
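Since a SystemSoundID is just a UInt32, one workable approach is a plain C array filled in a loop; a minimal sketch, with the resource name and array length as placeholders:

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>

#define kNumExplosionSounds 2
static SystemSoundID explosionSounds[kNumExplosionSounds];

static void LoadExplosionSounds(void) {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"explosion" ofType:@"caf"];
    NSURL *url = [NSURL fileURLWithPath:path];
    for (int i = 0; i < kNumExplosionSounds; i++) {
        // Each slot gets its own SystemSoundID, so the same file can overlap itself.
        AudioServicesCreateSystemSoundID((__bridge CFURLRef)url, &explosionSounds[i]);
    }
}

// Later: AudioServicesPlaySystemSound(explosionSounds[0]);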

MP3 sounds play on the simulator but not on the device (audio beatbox)

Submitted by 旧巷老猫 on 2019-12-11 16:58:26
Question: I have a really serious problem on my end: I'm making an audio-synchronized beatbox with 16 different sounds playing at the same time. To get time-sync of the MP3 files, we gave up on AVAudioPlayer and switched to the AUGraph approach from MixerHostAudio.m and MixerHostAudio.h, which I downloaded from Apple's developer resources. The implementation works fine in the Simulator but not on the device, which crashed after loading the 4th file... we were loading the files as
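MixerHostAudio-style code reads each file into memory as decompressed PCM, and a device has far less headroom than the Simulator, so sixteen decompressed MP3s can plausibly exhaust memory after a few loads. A hedged sketch of the loading step, using a 16-bit integer client format (half the footprint of float or 8.24 fixed-point samples) and surfacing the OSStatus; the function name and parameters are illustrative, not the sample project's API:

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#include <stdlib.h>

// Decodes an audio file to 16-bit interleaved stereo PCM while reading it.
// Returns the number of frames placed in *outBuffer (caller frees it), 0 on failure.
static UInt32 LoadPCM(NSURL *url, double sampleRate, SInt16 **outBuffer) {
    ExtAudioFileRef file = NULL;
    if (ExtAudioFileOpenURL((__bridge CFURLRef)url, &file) != noErr) return 0;

    // Ask ExtAudioFile to decode into this client format for us.
    AudioStreamBasicDescription fmt = {0};
    fmt.mSampleRate       = sampleRate;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    fmt.mChannelsPerFrame = 2;
    fmt.mBitsPerChannel   = 16;
    fmt.mBytesPerFrame    = fmt.mChannelsPerFrame * sizeof(SInt16);
    fmt.mBytesPerPacket   = fmt.mBytesPerFrame;
    fmt.mFramesPerPacket  = 1;
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat, sizeof(fmt), &fmt);

    SInt64 fileFrames = 0;
    UInt32 size = sizeof(fileFrames);
    ExtAudioFileGetProperty(file, kExtAudioFileProperty_FileLengthFrames, &size, &fileFrames);

    *outBuffer = (SInt16 *)malloc((size_t)fileFrames * fmt.mBytesPerFrame);
    AudioBufferList list;
    list.mNumberBuffers = 1;
    list.mBuffers[0].mNumberChannels = fmt.mChannelsPerFrame;
    list.mBuffers[0].mDataByteSize   = (UInt32)(fileFrames * fmt.mBytesPerFrame);
    list.mBuffers[0].mData           = *outBuffer;

    UInt32 frames = (UInt32)fileFrames;
    OSStatus status = ExtAudioFileRead(file, &frames, &list);
    ExtAudioFileClose(file);
    return (status == noErr) ? frames : 0;
}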

AudioUnit framework not found

Submitted by 柔情痞子 on 2019-12-11 06:04:30
Question: I am implementing an audio-based application in which I need to play both application audio and iPod audio. When I try to run my application I get an error like: ld: framework not found AudioUnit collect2: ld returned 1 exit status Command /Developer/Platforms/iPhoneSimulator.platform/Developer/usr/bin/llvm-gcc-4.2 failed with exit code 1 My view controller code is as follows, .h file: #import <UIKit/UIKit.h> #import <MediaPlayer/MediaPlayer.h> #import <AudioToolbox/AudioToolbox.h>
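For what it's worth, a sketch of the imports this kind of view controller typically needs, assuming the frameworks are added under the target's "Link Binary With Libraries" build phase; a stray hand-typed "-framework AudioUnit" entry in Other Linker Flags is one common source of this exact linker error.

#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>   // MPMusicPlayerController for the iPod audio
#import <AudioToolbox/AudioToolbox.h> // C-level audio session / Audio Unit APIs
#import <AVFoundation/AVFoundation.h> // AVAudioPlayer / AVAudioSession for the app's own audio
// Link MediaPlayer.framework, AudioToolbox.framework and AVFoundation.framework
// in the "Link Binary With Libraries" build phase rather than via manual linker flags.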

MusicSequenceFileLoad returns -1 on iOS 10 (AudioToolbox/MusicPlayer)

Submitted by 时光怂恿深爱的人放手 on 2019-12-11 05:54:49
Question: UPDATE 10/9/2016: I had opened Radar #28425770 with Apple on 9/22/2016 for the following defect, and they have just marked it as a duplicate (of Radar #28327056), so this appears to be a known bug in iOS 10. I've encountered an error invoking the AudioToolbox / MusicPlayer API method MusicSequenceFileLoad() to load the contents of a MIDI file (from the given URL) into a music sequence on an iOS 10 iPad Pro (Wi-Fi model) and would greatly appreciate some assistance from the community in
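For reference, the call sequence itself is small; a minimal sketch of loading a MIDI file into a sequence with the status surfaced, which is where the -1 appears on the affected iOS 10 builds (the URL argument is a placeholder):

#import <AudioToolbox/AudioToolbox.h>

static MusicSequence LoadSequence(NSURL *midiURL) {
    MusicSequence sequence = NULL;
    OSStatus status = NewMusicSequence(&sequence);
    if (status != noErr) return NULL;

    // kMusicSequenceLoadSMF_ChannelsToTracks puts each MIDI channel on its own track.
    status = MusicSequenceFileLoad(sequence, (__bridge CFURLRef)midiURL,
                                   kMusicSequenceFile_MIDIType,
                                   kMusicSequenceLoadSMF_ChannelsToTracks);
    if (status != noErr) {
        NSLog(@"MusicSequenceFileLoad failed: %d", (int)status); // -1 on the affected builds
        DisposeMusicSequence(sequence);
        return NULL;
    }
    return sequence;
}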

Setting Mac OS X volume programmatically after 10.6 (Snow Leopard)

Submitted by 只愿长相守 on 2019-12-11 05:02:09
Question: Is there a way to set the Mac's system volume using Objective-C? I tried using: AudioDeviceSetProperty([[self class] defaultOutputDeviceID], NULL, //time stamp not needed 0, //channel 0 is master channel false, //for an output device kAudioDevicePropertyVolumeScalar, sizeof(Float32), &volume); But it is deprecated after OS X 10.6 (Snow Leopard); is there a better way to do this? Or will I have to settle for application volume? Answer 1: Check out https://developer.apple.com/library/mac
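The deprecated AudioDeviceSetProperty call maps onto the AudioObject API; below is a minimal sketch that sets the scalar volume of the default output device with AudioObjectSetPropertyData. Note that some devices expose volume only on the individual channel elements (1 and 2) rather than on the master element, so a robust version falls back to setting those.

#import <CoreAudio/CoreAudio.h>

static OSStatus SetDefaultOutputVolume(Float32 volume) {
    // Find the default output device.
    AudioDeviceID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDefaultOutputDevice,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    OSStatus status = AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr,
                                                 0, NULL, &size, &device);
    if (status != noErr) return status;

    // Set the scalar volume (0.0 - 1.0) on the output scope.
    addr.mSelector = kAudioDevicePropertyVolumeScalar;
    addr.mScope    = kAudioObjectPropertyScopeOutput;
    addr.mElement  = kAudioObjectPropertyElementMaster; // try elements 1 and 2 if this fails
    return AudioObjectSetPropertyData(device, &addr, 0, NULL, sizeof(volume), &volume);
}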

Switching between headphone & speaker on iPhone

Submitted by ぃ、小莉子 on 2019-12-11 04:42:19
Question: I am trying to set up the audio routing for an iPhone app's output. I am using a route-change listener to detect when the audio route has changed. The listener detects the changes, such as when the headphones are plugged in and unplugged. By default the speaker plays audio; then I plug my headphones in and the audio transmits through the headphones fine. From there, no further changes take effect, even though the route-change listener is detecting them. Any help would be really appreciated. NSError
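A hedged sketch of the session side of this, using the modern AVAudioSession Objective-C API rather than the old C AudioSessionAddPropertyListener route-change callback: observe route-change notifications, and apply an output override when playback should be forced to the built-in speaker.

#import <AVFoundation/AVFoundation.h>

static void ConfigureAudioRouting(void) {
    // Observe route changes (headphones plugged in / unplugged, etc.).
    [[NSNotificationCenter defaultCenter]
        addObserverForName:AVAudioSessionRouteChangeNotification
                    object:nil
                     queue:[NSOperationQueue mainQueue]
                usingBlock:^(NSNotification *note) {
        NSUInteger reason =
            [note.userInfo[AVAudioSessionRouteChangeReasonKey] unsignedIntegerValue];
        NSLog(@"Route changed, reason %lu", (unsigned long)reason);
    }];

    // PlayAndRecord is required for the speaker override; pass
    // AVAudioSessionPortOverrideNone to return to the normal route (headphones).
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
    [session setActive:YES error:&error];
}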