core-audio

Controlling the volume of other applications

Submitted by 谁都会走 on 2019-12-20 02:38:11
Question: I am trying to make an app that controls the volume of another process using the Windows 7 audio API. What I'm looking for is the ISimpleAudioVolume for the session used by the other process. I have tried using the IAudioSessionEnumerator, but it only gives me the IAudioSessionControl2 for the session. Using the IAudioSessionControl I have managed to receive notifications when I change the volume through sndvol, but not to change the volume myself. I have also tried using GetSimpleAudioVolume() from…
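For reference, a sketch of the approach that usually answers this: enumerate the sessions on the default render endpoint, match the target process ID through IAudioSessionControl2, then QueryInterface that session control for ISimpleAudioVolume. This is written in the plain-C subset (COBJMACROS), leaves out error handling, and assumes COM is already initialized; targetPid and level are caller-supplied placeholders.

// Sketch only: find another process's audio session and set its volume.
// Build note (assumption): link ole32.lib; depending on the toolchain the
// CLSID/IID constants may also need uuid.lib or an <initguid.h> include.
#define COBJMACROS
#include <windows.h>
#include <mmdeviceapi.h>
#include <audioclient.h>
#include <audiopolicy.h>

void SetProcessVolume(DWORD targetPid, float level)
{
    IMMDeviceEnumerator *devEnum = NULL;
    IMMDevice *device = NULL;
    IAudioSessionManager2 *mgr = NULL;
    IAudioSessionEnumerator *sessions = NULL;
    int count = 0;

    CoCreateInstance(&CLSID_MMDeviceEnumerator, NULL, CLSCTX_ALL,
                     &IID_IMMDeviceEnumerator, (void **)&devEnum);
    IMMDeviceEnumerator_GetDefaultAudioEndpoint(devEnum, eRender, eMultimedia, &device);
    IMMDevice_Activate(device, &IID_IAudioSessionManager2, CLSCTX_ALL, NULL, (void **)&mgr);
    IAudioSessionManager2_GetSessionEnumerator(mgr, &sessions);
    IAudioSessionEnumerator_GetCount(sessions, &count);

    for (int i = 0; i < count; i++) {
        IAudioSessionControl *ctrl = NULL;
        IAudioSessionControl2 *ctrl2 = NULL;
        DWORD pid = 0;

        IAudioSessionEnumerator_GetSession(sessions, i, &ctrl);
        IAudioSessionControl_QueryInterface(ctrl, &IID_IAudioSessionControl2, (void **)&ctrl2);
        IAudioSessionControl2_GetProcessId(ctrl2, &pid);

        if (pid == targetPid) {
            // The per-session volume interface comes from the session control itself.
            ISimpleAudioVolume *vol = NULL;
            IAudioSessionControl2_QueryInterface(ctrl2, &IID_ISimpleAudioVolume, (void **)&vol);
            ISimpleAudioVolume_SetMasterVolume(vol, level, NULL);
            ISimpleAudioVolume_Release(vol);
        }
        IAudioSessionControl2_Release(ctrl2);
        IAudioSessionControl_Release(ctrl);
    }
    IAudioSessionEnumerator_Release(sessions);
    IAudioSessionManager2_Release(mgr);
    IMMDevice_Release(device);
    IMMDeviceEnumerator_Release(devEnum);
}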

OSX programmatically invoke sound level graphic

Submitted by 不打扰是莪最后的温柔 on 2019-12-19 09:57:19
Question: I have an app which can change the volume under OS X. What it lacks is the visual feedback provided when one presses the sound up/down keys. Does anyone know how to programmatically invoke that behavior? Thanks. Answer 1: Here's a little code from George Warner and Casey Fleser that does this trick. Think carefully about whether this is really the way you want to do things. // Save as sound_up.m // Compile: gcc -o sound_up sound_up.m -framework IOKit -framework Cocoa #import <Cocoa/Cocoa.h> #import <IOKit…

AVAudioPlayer initialization error

Submitted by 拥有回忆 on 2019-12-19 06:18:14
Question: I initialize my AVAudioPlayer instance like: [self.audioPlayer initWithContentsOfURL:url error:&err]; url contains the path of an .m4a file. The following error is displayed in the console when this line is called: "Error Domain=NSOSStatusErrorDomain Code=1685348671 'Operation could not be completed. (OSStatus error 1685348671.)'" What is the reason for this error? Answer 1: The error code is a four-char code for 'dta?' (you can use the Calculator app in programmer mode to convert the int values…

How do I stream AVAsset audio wirelessly from one iOS device to another?

Submitted by 痞子三分冷 on 2019-12-19 04:46:42
Question: I'm making something that streams audio from the iPod library, sends the data over the network or Bluetooth, and plays it back using an audio queue. Thanks for this question and code; they helped me a lot. I have two questions about it. 1. What should I send from one device to the other? A CMSampleBufferRef? An AudioBuffer? mData? An AudioQueueBuffer? A packet? I have no idea. 2. When the app finishes playing, it crashes and I get error (-12733). I just want to know how to handle the error instead of letting it crash.

How do you display song lyrics in Karaoke style on the iPhone?

Submitted by 时光毁灭记忆、已成空白 on 2019-12-19 04:37:10
Question: I am currently creating an app that plays music. I would like to add a feature that shows the song lyrics while the music plays, with the current position in the text marked to match the current position in the song: the bouncing-ball effect, just like you see on every karaoke screen as the song plays. I have been looking into extending my CAF files, adding "string chunks" and then reading them out again. Is this the correct way to do this? Does anybody know of a better/easier/normal way to…

iPhone: NSData representation of Audio file for Editing

Submitted by 人盡茶涼 on 2019-12-19 03:38:22
Question: I have been scratching my head over this for a long time now but can't get past it. I haven't found a single example of audio editing! I want to insert a new audio file somewhere in the middle of an original audio file and save the result as a new audio file. For this I have written the following code (I got the idea from here). NSString *file1 = [[NSBundle mainBundle] pathForResource:@"file1" ofType:@"caf"]; // Using PCM format NSString *file2 = [[NSBundle mainBundle] pathForResource:@"file2" ofType:@"caf"];…

iOS8 AVAudioEngine how to send microphone data over Multipeer Connectivity?

Submitted by 爱⌒轻易说出口 on 2019-12-19 03:22:29
Question: I want to send microphone audio data over Multipeer Connectivity (iOS 8) and play it through the speaker of the receiving peer. I've also set up the AVAudioEngine and I can hear the microphone data through the (upper) speaker output, but I don't know how to send an AVAudioPCMBuffer over the network. Here's my code snippet: AVAudioInputNode *inputNode = [self.engine inputNode]; AVAudioMixerNode *mainMixer = [self.engine mainMixerNode]; [self.engine connect:inputNode to:mainMixer format:[inputNode…

Use rear microphone of iPhone 5

Submitted by 家住魔仙堡 on 2019-12-18 21:23:06
Question: I have used the following code to stream audio I/O from the microphone. What I want to do is select the rear microphone for recording. I have read that setting kAudioSessionProperty_Mode to kAudioSessionMode_VideoRecording can do the job, but I am not sure how to use this with my code. Can anyone help me set this parameter successfully? I have these lines for setting the property: status = AudioUnitSetProperty(audioUnit, kAudioSessionProperty_Mode, kAudioSessionMode…

Getting mic input and speaker output using Core Audio

Submitted by 假如想象 on 2019-12-18 18:23:10
Question: So I looked into Core Audio a bit recently and am still a newbie. I have trouble understanding what data I am tapping into and how it affects the overall data flow. For some background: I have an app that does video/audio streaming between phones using WebRTC. However, I want to inspect the data that is being input into the device through my mic and the data output through the speaker. I looked into the aurioTouch demo and Core Audio, and currently I have this: - (void)setupIOUnit {…