audiotoolbox

Swift 3 AudioToolbox: how to use AudioQueueAllocateBuffer for PCM playback?

Submitted by 隐身守侯 on 2019-12-25 16:57:48
Question: I am following https://github.com/AlesTsurko/LearningCoreAudioWithSwift2.0/tree/master/CH05_Player to play back a frequency, but it is written in Swift 2. "Get microphone input using Audio Queue in Swift 3" has resolved many of the issues, but it covers recording. I am stuck at allocating a buffer for the audio queue:

var ringBuffers = [AudioQueueBufferRef](repeating: nil, count: 3)
AudioQueueAllocateBuffer(inQueue!, bufferSize, &ringBuffers[0])

It gives an error: main.swift:152:29: Expression type '
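The Swift 3 compiler rejects an array of non-optional AudioQueueBufferRef seeded with nil, which is what produces the "Expression type" error. A minimal sketch of the usual fix, modeling AudioQueueBufferRef with a raw-pointer typealias so the type-inference point compiles outside Apple platforms:

```swift
// On Apple platforms AudioQueueBufferRef is UnsafeMutablePointer<AudioQueueBuffer>;
// a raw pointer stands in for it here so this compiles anywhere.
typealias AudioQueueBufferRef = UnsafeMutableRawPointer

// Swift 3 rejects [AudioQueueBufferRef](repeating: nil, count: 3): nil is not a
// value of a non-optional pointer type. Make the element type optional instead:
var ringBuffers = [AudioQueueBufferRef?](repeating: nil, count: 3)
print(ringBuffers.count)      // 3
print(ringBuffers[0] == nil)  // true
```

With the optional element type, &ringBuffers[0] matches the UnsafeMutablePointer<AudioQueueBufferRef?> out-parameter that AudioQueueAllocateBuffer takes in Swift 3.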

RoboVM app crashes on ptr.get()

Submitted by 僤鯓⒐⒋嵵緔 on 2019-12-25 12:49:54
Question: Is there some kind of magic required to get ptr.get() to work? For some reason the following code always crashes my app:

AudioStreamBasicDescription asbd = new AudioStreamBasicDescription(mSampleRate, mFormatID, mFormatFlags, mBytesPerPacket, mFramesPerPacket, mBytesPerFrame, mChannelsPerFrame, mBitsPerChannel, 0);
AudioFilePtr outAudioFile = new AudioFilePtr();
File f = File.createTempFile("ptt", ".caf");
AudioFileError afe = AudioFile.createWithURL(new NSURL(f), 1667327590, asbd, 1,

Play a .wav with Audio Toolbox

Submitted by 自作多情 on 2019-12-24 14:03:43
Question: I am trying to play a .wav file using Audio Toolbox. I have the code below so far, but my file will not play. The if condition is triggered correctly, yet the sound never plays; it does print the error message when the file isn't there. Any clue what is wrong?

-(void) playSound : (NSString *) fName : (NSString *) ext {
    SystemSoundID audioEffect;
    NSString *path = [[NSBundle mainBundle] pathForResource : fName ofType : ext];
    if ([[NSFileManager defaultManager] fileExistsAtPath : path]) {
        NSURL
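A Swift sketch of the same lookup-then-play flow, with the AudioServices calls shown as comments since AudioToolbox exists only on Apple platforms. Two things worth ruling out here: a nil bundle path (resource not copied into the target), and the system-sound limits, since AudioServices playback silently fails for files longer than about 30 seconds or in unsupported encodings. The function name and parameters mirror the question and are otherwise illustrative:

```swift
import Foundation

// Returns true when the resource was found; the actual playback calls are
// Apple-only and shown as comments.
func playSound(named name: String, ext: String) -> Bool {
    guard let path = Bundle.main.path(forResource: name, ofType: ext) else {
        print("Resource \(name).\(ext) is not in the bundle")
        return false
    }
    // var soundID: SystemSoundID = 0
    // AudioServicesCreateSystemSoundID(URL(fileURLWithPath: path) as CFURL, &soundID)
    // AudioServicesPlaySystemSound(soundID)
    _ = path
    return true
}
```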

Joining two CAF files together

Submitted by 荒凉一梦 on 2019-12-24 00:58:18
Question: I have a simple problem here: I have two CAF files, and all I want to do is join them into one long audio file. I've tried to:

- Use the NSData class and append the audio data of both files into one. This hasn't worked; I assume some file property is not being set properly, since the resulting file only plays for the length of the first file.
- Set the kAudioFilePropertyAudioDataByteCount property of the resulting file in my joining process. No result.

Could anyone point me in the right direction /
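The naive NSData append fails because a CAF file is a header plus chunks, and the first file's data-chunk bookkeeping still covers only its own audio, so players stop there. A toy model of the fix, where a one-byte length header stands in for the real CAF chunk layout (a real implementation would read and rewrite via the AudioFile or ExtAudioFile APIs):

```swift
// Toy "file": a 1-byte payload length followed by the payload bytes.
func makeFile(_ payload: [UInt8]) -> [UInt8] { [UInt8(payload.count)] + payload }

// Joining means concatenating the payloads AND rewriting the length header;
// appending the raw bytes of `b` after `a` leaves a's header too small,
// which is exactly why only the first file's length plays back.
func join(_ a: [UInt8], _ b: [UInt8]) -> [UInt8] {
    let payload = Array(a.dropFirst()) + Array(b.dropFirst())
    return [UInt8(payload.count)] + payload
}

print(join(makeFile([1, 2]), makeFile([3])))  // [3, 1, 2, 3]
```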

Change the duration of AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

Submitted by 你离开我真会死。 on 2019-12-23 03:14:38
Question: I'm using AudioServicesPlayAlertSound(kSystemSoundID_Vibrate); How can I choose the duration of the vibration? I want only one short vibration, like the "click" from tapping a keyboard button.

Answer 1: For a short vibration you can use AudioServicesPlaySystemSound(1520); // or 1521

Source: https://stackoverflow.com/questions/22176750/change-the-duration-of-audioservicesplaysystemsoundksystemsoundid-vibrate
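The duration of kSystemSoundID_Vibrate itself cannot be changed; the workaround is to play one of the shorter, undocumented system sound IDs instead. A small illustrative helper (the enum and its case names are hypothetical; 1519/1520/1521 are the commonly reported short-vibration IDs, and only 4095 is documented):

```swift
// Hypothetical mapping from a feedback style to the system sound ID that
// triggers it; on iOS you would pass the ID to AudioServicesPlaySystemSound.
enum VibrationStyle {
    case long       // kSystemSoundID_Vibrate, the full-length buzz
    case peek       // very short tap (undocumented)
    case pop        // short tap (undocumented)
    case tripleTap  // three quick pulses (undocumented)

    var soundID: UInt32 {
        switch self {
        case .long:      return 4095
        case .peek:      return 1519
        case .pop:       return 1520
        case .tripleTap: return 1521
        }
    }
}

print(VibrationStyle.pop.soundID)  // 1520
```

Note that these short IDs rely on the Taptic Engine and do nothing on older devices, so they are best treated as a nice-to-have rather than guaranteed behavior.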

Is AudioServicesDisposeSystemSoundID required?

Submitted by 余生长醉 on 2019-12-22 11:00:51
Question: I recently started working with the AudioToolbox framework and noticed a function called AudioServicesDisposeSystemSoundID(). Just to know: is it a memory leak not to call it after using AudioServicesCreateSystemSoundID() to initialize my SystemSoundID? I am calling it like:

AudioServicesCreateSystemSoundID((CFURLRef)filePath, &sound);

where filePath is an NSURL and sound a SystemSoundID.

Answer 1: Yes. Call it when you're done with sound. Otherwise, you may leak any
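Every AudioServicesCreateSystemSoundID should be balanced by an AudioServicesDisposeSystemSoundID once the sound is no longer needed. A hypothetical ownership wrapper that guarantees the pairing via deinit; the dispose closure stands in for the real AudioToolbox call so the pattern is testable anywhere:

```swift
// Ties the lifetime of a SystemSoundID-like handle to an object, so disposal
// happens automatically when the wrapper is released.
final class OwnedSoundID {
    let id: UInt32
    private let dispose: (UInt32) -> Void
    init(id: UInt32, dispose: @escaping (UInt32) -> Void) {
        self.id = id
        self.dispose = dispose
    }
    deinit { dispose(id) }  // e.g. AudioServicesDisposeSystemSoundID(id)
}

// The wrapper is created and immediately released, so dispose runs right away.
var disposed: [UInt32] = []
_ = OwnedSoundID(id: 7) { disposed.append($0) }
print(disposed)  // [7]
```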

Extract meter levels from audio file

Submitted by 烂漫一生 on 2019-12-21 03:33:53
Question: I need to extract audio meter levels from a file so I can render the levels before playing the audio. I know AVAudioPlayer can get this information while playing the audio file through func averagePower(forChannel channelNumber: Int) -> Float, but in my case I would like to obtain a [Float] of meter levels beforehand.

Answer 1: Swift 4. On an iPhone it takes: 0.538 s to process an 8 MByte mp3 file with a 4 min 47 s duration and a 44,100 Hz sampling rate; 0.170 s to process a 712 KByte mp3 file with a 22 s
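Once the samples have been read (for example with AVAudioFile into a float buffer), the core of pre-computing meter levels is windowed RMS power converted to decibels. A sketch with illustrative names; the windowing and the dB formula are standard, only the function itself is made up:

```swift
import Foundation

// Splits `samples` into fixed windows and returns one dB power level per window.
func meterLevels(samples: [Float], windowSize: Int) -> [Float] {
    precondition(windowSize > 0)
    return stride(from: 0, to: samples.count, by: windowSize).map { start in
        let window = samples[start..<min(start + windowSize, samples.count)]
        let meanSquare = window.reduce(0) { $0 + $1 * $1 } / Float(window.count)
        // 10·log10 of mean-square power; the floor avoids log10(0) on silence.
        return 10 * Float(log10(Double(max(meanSquare, 1e-12))))
    }
}

print(meterLevels(samples: [1, 1, 1, 1], windowSize: 2))  // [0.0, 0.0] (full scale)
```

Rendering code would then normalize these dB values into bar heights; full-scale samples sit at 0 dB and silence at the floor.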

Receiving kAUGraphErr_CannotDoInCurrentContext when calling AUGraphStart for playback

Submitted by 試著忘記壹切 on 2019-12-20 15:37:23
Question: I'm working with the AUGraph and Audio Unit APIs to play back and record audio in my iOS app. I now have a rare issue where an AUGraph is unable to start, with the following error:

result = kAUGraphErr_CannotDoInCurrentContext (-10863)

The error occurs unpredictably when we call AUGraphStart, which is set up for audio playback:

- (BOOL)startRendering {
    if (playing) { return YES; }
    playing = YES;
    if (NO == [self setupAudioForGraph:&au_play_graph playout:YES]) {
        print_error("Failed to create
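kAUGraphErr_CannotDoInCurrentContext generally means the graph could not take its render lock at that instant, and the conventional workaround is to retry AUGraphStart shortly afterwards. A generic retry sketch in Swift; the error code -10863 is real, while the helper and its parameters are illustrative:

```swift
import Foundation

let kAUGraphErr_CannotDoInCurrentContext: Int32 = -10863

// Retries `operation` (e.g. { AUGraphStart(graph) }) while it reports the
// transient -10863 error, sleeping briefly between attempts.
func startWithRetry(attempts: Int, _ operation: () -> Int32) -> Int32 {
    var status: Int32 = 0
    for attempt in 0..<attempts {
        status = operation()
        if status != kAUGraphErr_CannotDoInCurrentContext { break }
        if attempt < attempts - 1 {
            Thread.sleep(forTimeInterval: 0.01)  // 10 ms back-off
        }
    }
    return status
}

// Simulated AUGraphStart: fails twice with -10863, then succeeds.
var calls = 0
let status = startWithRetry(attempts: 5) {
    calls += 1
    return calls < 3 ? kAUGraphErr_CannotDoInCurrentContext : 0
}
print(status, calls)  // 0 3
```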

Cannot test / deploy on an iPhone device? I get "Command /usr/bin/codesign failed with exit code 1"

Submitted by 只愿长相守 on 2019-12-13 21:17:50
Question: Error:

CodeSign /Users/edgarsifontes/Library/Developer/Xcode/DerivedData/BloggApp-fsjkfumdqbfxknenwfkgakigvosp/Build/Products/Debug-iphoneos/BloggApp.app/Frameworks/AudioToolbox.framework
cd /Users/edgarsifontes/Documents/BloggApp
export CODESIGN_ALLOCATE=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/codesign_allocate
export PATH="/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin:/Applications/Xcode.app/Contents