core-audio

audio unit fails to run in Logic; .exp _Entry undefined in linker

∥☆過路亽.° Submitted on 2019-12-25 17:47:08
Question: Background: I am trying to get Apple's example TremoloUnit to run in Logic 9. From various forums and this SO answer, the problem with Apple's examples seems to be that Logic 9 (and many other AU hosts) still uses the old Carbon component resources. According to this technical note, adding an appropriate .r file should provide the needed backwards compatibility, so I added a .r that matches the sample .plist. The Problem: If I include the line _TremoloUnitEntry in my .exp, the linker throws this error: Undefined
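For reference, a hedged sketch of what such a .r file can look like. The subtype 'tmlo', manufacturer 'Appl', name, and version below are illustrative assumptions; each value must mirror the corresponding entry in the component's Info.plist, and ENTRY_POINT must match the symbol exported in the .exp (which carries a leading underscore, i.e. _TremoloUnitEntry in the .exp corresponds to "TremoloUnitEntry" here):

```
// TremoloUnit.r -- Carbon component resource (sketch; values are assumptions
// and must mirror the AudioComponents entry in Info.plist)
#include <AudioUnit/AudioUnit.r>

#define RES_ID          1000
#define COMP_TYPE       kAudioUnitType_Effect
#define COMP_SUBTYPE    'tmlo'               // assumed subtype code
#define COMP_MANUF      'Appl'               // assumed manufacturer code
#define VERSION         0x00010000
#define NAME            "Apple Demo: TremoloUnit"
#define DESCRIPTION     "Tremolo effect unit"
#define ENTRY_POINT     "TremoloUnitEntry"   // must match the exported symbol

#include "AUResources.r"
```

AUResources.r expands these defines into the 'thng' component resource that Carbon-resource hosts scan for.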

Converting mp3 to caf file for iPhone

跟風遠走 Submitted on 2019-12-25 16:00:24
Question: I am experimenting with the sample rate while converting an mp3 file into CAF format:
afconvert -f caff -d LEI16@44100 ./fricatives_play_all.mp3 ./test.caf
afconvert -f caff -d LEI16@22100 ./fricatives_play_all.mp3 ./test.caf
afconvert -f caff -d LEI16@12000 ./fricatives_play_all.mp3 ./test.caf
afconvert -f caff -d LEI16@20100 ./fricatives_play_all.mp3 ./test.caf
afconvert -f caff -d LEI16@15100 ./fricatives_play_all.mp3 ./test.caf
afconvert -f caff -d LEI16@14100 ./fricatives_play_all.mp3

Concatenating Audio Buffers in ObjectiveC

↘锁芯ラ Submitted on 2019-12-25 07:04:46
Question: First of all, I am a newbie in C and Objective-C. I am trying to FFT a buffer of audio and plot its graph. I use an Audio Unit callback to get the audio buffer. The callback brings 512 frames, but after 471 frames the rest are 0. (I don't know whether this is normal. It used to bring 471 frames full of numbers, but now somehow it brings 512 frames with 0 after 471. Please let me know if this is normal.) Anyway, I can get the buffer from the callback, apply the FFT, and draw it; this works perfectly, and here is the
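One common way to concatenate fixed-size callback buffers into a larger analysis window is to append into a growable float array, copying only the frames that actually carry data (471 of the 512 in the situation described). A minimal sketch in plain C; the names here are illustrative, not from the question's code:

```c
#include <stdlib.h>
#include <string.h>

typedef struct {
    float  *data;      /* accumulated samples */
    size_t  length;    /* samples currently stored */
    size_t  capacity;  /* allocated samples */
} FloatBuffer;

/* Append `count` samples, growing the backing store as needed.
   Returns 0 on success, -1 on allocation failure. */
static int fb_append(FloatBuffer *fb, const float *samples, size_t count) {
    if (fb->length + count > fb->capacity) {
        size_t newcap = fb->capacity ? fb->capacity * 2 : 1024;
        while (newcap < fb->length + count) newcap *= 2;
        float *grown = realloc(fb->data, newcap * sizeof *grown);
        if (!grown) return -1;
        fb->data = grown;
        fb->capacity = newcap;
    }
    memcpy(fb->data + fb->length, samples, count * sizeof *samples);
    fb->length += count;
    return 0;
}
```

In the render callback you would call fb_append(&fb, ioData, validFrames) with validFrames = 471 so the zero padding never enters the FFT window, then run the FFT once enough samples have accumulated.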

OSX AudioUnit SMP

﹥>﹥吖頭↗ Submitted on 2019-12-25 02:39:10
Question: I'd like to know if someone has experience writing a HAL AudioUnit rendering callback that takes advantage of multi-core processors and/or symmetric multiprocessing. My scenario is the following: a single audio component of sub-type kAudioUnitSubType_HALOutput (together with its rendering callback) takes care of additively synthesizing n sinusoidal partials with independent, individually varying, live-updated amplitude and phase values. In itself it is a rather straightforward brute-force

'NSInvalidArgumentException', reason: '-[AVPlayerItem duration]: unrecognized selector sent

纵饮孤独 Submitted on 2019-12-25 02:26:02
Question: I am trying to play a MediaItem using AVPlayer and later trying to get the duration of the current item like this (duration is an object of type CMTime): duration = [[player currentItem] duration]; I get no issues on iPad, but on iPod Touch I get the following error. I haven't tried this on iPhone yet. Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[AVPlayerItem duration]: unrecognized selector sent to instance 0x186100' Any pointers to the fix will be

my iOS app using audio units with an 8000 hertz sample rate returns a distorted voice

大兔子大兔子 Submitted on 2019-12-24 23:13:54
Question: I really need help with this issue. I'm developing an iOS application with Audio Units; the recorded audio needs to be 8-bit at an 8000 Hz sample rate, in alaw format. However, I'm getting a distorted voice coming out of the speaker. I came across this sample online: http://www.stefanpopp.de/2011/capture-iphone-microphone/comment-page-1/ While trying to debug my app I used my audioFormat in his application, and I am getting the same distorted sound. I'm guessing I either have incorrect settings or
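A frequent source of distortion with compressed formats like A-law is an internally inconsistent stream description, where the byte/frame/packet fields disagree with each other. The struct below is a self-contained stand-in that mirrors the relevant fields of CoreAudio's AudioStreamBasicDescription (redeclared here for illustration; on iOS you would fill the real type), with the values that are mutually consistent for 8 kHz mono 8-bit A-law:

```c
#include <stdbool.h>

/* Stand-in mirroring the relevant AudioStreamBasicDescription fields. */
typedef struct {
    double   mSampleRate;
    unsigned mBitsPerChannel;
    unsigned mChannelsPerFrame;
    unsigned mBytesPerFrame;
    unsigned mFramesPerPacket;
    unsigned mBytesPerPacket;
} StreamDesc;

/* For constant-bitrate formats the derived sizes must all line up;
   mismatched fields are a classic cause of garbage/distorted output. */
bool desc_is_consistent(const StreamDesc *d) {
    return d->mBytesPerFrame  == d->mChannelsPerFrame * d->mBitsPerChannel / 8
        && d->mBytesPerPacket == d->mBytesPerFrame * d->mFramesPerPacket;
}

/* 8 kHz, mono, 8-bit A-law: one byte per frame, one frame per packet. */
static const StreamDesc kALaw8kMono = {8000.0, 8, 1, 1, 1, 1};
```

If, say, mBytesPerFrame were left at 2 (copied from a 16-bit PCM setup), the check fails, which is exactly the kind of mismatch that produces distorted playback.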

Background and foreground apps using audio

瘦欲@ Submitted on 2019-12-24 16:32:51
Question: I did some preliminary tests and have a good idea the answer is no, but I just need to confirm: can a background app and a foreground app share the audio playback device? (The background app will be mine; the foreground app will be from a third party.) Answer 1: That is possible, and here is how. Make sure that the app continues playing audio when left in the background by doing this: a) add the following to your Info.plist file: "Required background modes" "Item 0" -> "App plays audio"; b) call setCategory:error: for
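The "Required background modes" entry from part (a) of the answer corresponds to this fragment in the XML form of Info.plist:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```

Part (b) of the (truncated) answer presumably refers to AVAudioSession's setCategory:error:, typically with AVAudioSessionCategoryPlayback, which is the category that permits audio to continue in the background.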

AudioQueue callback in simulator but not on device

耗尽温柔 Submitted on 2019-12-24 11:49:16
Question: I am currently working on an audio processing app for iPhone. It is based on Apple's SpeakHere sample code and aims at real-time audio processing and playback. The code works well in the simulator, but the callback is not invoked when tested on a device. The callback function is like this: void AQPlayer::AQBufferCallback(void * inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inCompleteAQBuffer) { AQPlayer *THIS = (AQPlayer *)inUserData; if (THIS->mIsDone) return; UInt32 numBytes; UInt32 nPackets = THIS-

AudioQueue fails to start

女生的网名这么多〃 Submitted on 2019-12-24 08:48:06
Question: I create an AudioQueue with the following steps:
1. Create a new output with AudioQueueNewOutput.
2. Add a property listener for the kAudioQueueProperty_IsRunning property.
3. Allocate my buffers with AudioQueueAllocateBuffer.
4. Call AudioQueuePrime.
5. Call AudioQueueStart.
The problem is, when I call AudioQueuePrime it outputs the following error on the console: AudioConverterNew returned -50 Prime failed (-50); will stop (11025/0 frames) What's wrong here? PS: I got this error on iOS (device & simulator). The output

AudioConverterNew returned -50

丶灬走出姿态 Submitted on 2019-12-24 02:44:19
Question: I have a little issue regarding the use of the AudioQueue services. I have followed the guide available on Apple's website, but when I get to starting and running the audio queue, I get a message telling me that "AudioConverterNew returned -50". Now, I know that the -50 error code means there is a bad parameter. However, what I don't know is which parameter is the bad one (thank you so much, Apple...)! So, here's my code. Here are the parameters of my class, named cPlayerCocoa
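As background on reading these codes: -50 is the classic Mac OS paramErr, while many other CoreAudio statuses are four-character codes (e.g. 'fmt?' for an unsupported data format) that print as meaningless large integers. A small helper like the sketch below (plain C, names are my own) makes logged OSStatus values much easier to interpret while hunting for the bad parameter:

```c
#include <stdio.h>

/* Render a CoreAudio-style status code into `out` (at least 16 bytes):
   if all four bytes are printable ASCII it is shown as a four-char code
   like 'fmt?', otherwise as a plain decimal number such as -50. */
void format_osstatus(long err, char out[16]) {
    unsigned char c[4] = {
        (unsigned char)((err >> 24) & 0xFF), (unsigned char)((err >> 16) & 0xFF),
        (unsigned char)((err >> 8)  & 0xFF), (unsigned char)(err & 0xFF)
    };
    if (c[0] >= 32 && c[0] < 127 && c[1] >= 32 && c[1] < 127 &&
        c[2] >= 32 && c[2] < 127 && c[3] >= 32 && c[3] < 127)
        snprintf(out, 16, "'%c%c%c%c'", c[0], c[1], c[2], c[3]);
    else
        snprintf(out, 16, "%ld", err);
}
```

For -50 specifically, the usual culprit with AudioConverterNew is an AudioStreamBasicDescription whose mBytesPerFrame, mFramesPerPacket, mBytesPerPacket, mChannelsPerFrame, and mBitsPerChannel fields do not agree with each other, so checking those field relationships is a good first step.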