AudioQueueServices

How to check current time and duration in AudioQueue

Submitted by 喜欢而已 on 2019-12-03 14:41:32
Question: How do I get the total duration of music in an AudioQueue? I am using:

    NSTimeInterval AQPlayer::getCurrentTime() {
        NSTimeInterval timeInterval = 0.0;
        AudioQueueTimelineRef timeLine;
        OSStatus status = AudioQueueCreateTimeline(mQueue, &timeLine);
        if (status == noErr) {
            AudioTimeStamp timeStamp;
            AudioQueueGetCurrentTime(mQueue, timeLine, &timeStamp, NULL);
            timeInterval = timeStamp.mSampleTime;
        }
        return timeInterval;
    }

with AudioQueueGetCurrentTime(mQueue, timeLine, &timeStamp, NULL); for getting the current
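One likely issue with the code above: mSampleTime is measured in sample frames at the queue's sample rate, not in seconds, so it must be divided by the rate from the stream's AudioStreamBasicDescription. Total duration for constant-bitrate audio is packets times frames-per-packet over the rate (or, more simply, query kAudioFilePropertyEstimatedDuration on the file). A minimal pure-C sketch of the arithmetic; the function names are illustrative, not CoreAudio API:

```c
#include <assert.h>

/* mSampleTime from AudioQueueGetCurrentTime counts sample frames at the
 * queue's rate; divide by mSampleRate to get seconds. */
static double frames_to_seconds(double sample_frames, double sample_rate) {
    return sample_frames / sample_rate;
}

/* Total duration for CBR audio: packets * frames-per-packet gives total
 * frames; divide by the rate from the AudioStreamBasicDescription. */
static double duration_seconds(unsigned long long packet_count,
                               unsigned frames_per_packet,
                               double sample_rate) {
    return (double)(packet_count * frames_per_packet) / sample_rate;
}
```

So the getter above would return frames_to_seconds(timeStamp.mSampleTime, mDataFormat.mSampleRate) instead of the raw mSampleTime.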

Why might my AudioQueueOutputCallback not be called?

Submitted by 人走茶凉 on 2019-12-03 13:46:32
Question: I'm using the Audio Queue Services API to play audio streamed from a server over a TCP socket connection on an iPhone. I can play the buffers that were filled from the socket connection; I just cannot seem to make my AudioQueue call my AudioQueueOutputCallback function, and I'm out of ideas. High-level design: data is passed to the player from the socket connection and written immediately into circular buffers in memory. As AudioQueueBuffers become available, data is copied from the circular
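Worth checking for this kind of symptom: AudioQueueOutputCallback only fires when a buffer that was enqueued with AudioQueueEnqueueBuffer finishes playing, so if no buffers are primed and enqueued before AudioQueueStart (or the run loop passed to AudioQueueNewOutput isn't running), the callback is never invoked. Below is a pure-C sketch of the circular-buffer drain such a callback would perform; the types and names are illustrative, not CoreAudio API:

```c
#include <stddef.h>

/* Minimal ring buffer with monotonic head/tail counters: the socket reader
 * advances head, the audio callback advances tail. */
typedef struct {
    unsigned char data[4096];
    size_t head, tail;   /* head: next byte to write; tail: next to read */
} RingBuffer;

static size_t ring_available(const RingBuffer *rb) {
    return rb->head - rb->tail;   /* counters are monotonic, head >= tail */
}

/* Copy up to cap bytes into dst (an AudioQueueBuffer's mAudioData in the
 * real callback) and return how many were copied -- set mAudioDataByteSize
 * to this value before re-enqueueing the buffer. */
static size_t ring_read(RingBuffer *rb, unsigned char *dst, size_t cap) {
    size_t n = ring_available(rb);
    if (n > cap) n = cap;
    for (size_t i = 0; i < n; i++)
        dst[i] = rb->data[(rb->tail + i) % sizeof rb->data];
    rb->tail += n;
    return n;
}
```

In the real callback, re-enqueue the buffer even when it receives zero bytes (or explicitly stop the queue); a buffer that is never re-enqueued can never complete, so the callback chain silently starves.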

Playback and Recording simultaneously using Core Audio in iOS

Submitted by 丶灬走出姿态 on 2019-12-03 07:28:36
I need to play and record simultaneously using Core Audio. I really do not want to use the AVFoundation API (AVAudioPlayer + AVAudioRecorder) to do this, as I am making a music app and cannot have any latency issues. I've looked at the following sample code from Apple: aurioTouch and MixerHost. I've already looked into the following posts: "iOS: Sample code for simultaneous record and playback" and "Record and play audio Simultaneously". I am still not clear on how I can play back and record the same thing simultaneously using Core Audio. Any pointers towards how I can achieve this will be greatly appreciated.
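For context on the latency concern driving this question: the usual low-latency route is the RemoteIO audio unit with both input and output enabled, and the per-buffer latency cost is simply the buffer duration. A tiny illustrative helper (not a CoreAudio call) makes the trade-off concrete:

```c
/* Latency contributed by one audio buffer: frames / sample_rate, in ms.
 * A 256-frame buffer at 44.1 kHz costs about 5.8 ms per pass, which is why
 * RemoteIO with small buffers is preferred over AVAudioPlayer/AVAudioRecorder
 * for a latency-sensitive music app. */
static double buffer_latency_ms(unsigned frames, double sample_rate) {
    return 1000.0 * (double)frames / sample_rate;
}
```

With RemoteIO, recording and playback share one audio unit: the input bus delivers recorded frames and the render callback supplies playback frames in the same duty cycle, so there is no cross-API buffering between the two paths.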

AVAssetReader to AudioQueueBuffer

Submitted by 白昼怎懂夜的黑 on 2019-12-01 11:15:33
Currently, I'm doing a little test project to see if I can get samples from an AVAssetReader to play back using an AudioQueue on iOS. I've read this ("Play raw uncompressed sound with AudioQueue, no sound") and this ("How to correctly read decoded PCM samples on iOS using AVAssetReader -- currently incorrect decoding"), which both actually did help. Before reading, I was getting no sound at all. Now I'm getting sound, but the audio is playing SUPER fast. This is my first foray into audio programming, so any help is greatly appreciated. I initialize the reader thusly: NSDictionary *
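"Playing SUPER fast" is the classic symptom of the queue's AudioStreamBasicDescription disagreeing with the PCM the AVAssetReader actually delivers: if bytes-per-frame is too large (e.g. stereo fields on mono data), the queue consumes frames at a multiple of the true rate. A portable sketch of the consistency check, using a local mirror of the relevant struct fields (the real struct lives in CoreAudioTypes.h):

```c
#include <stdbool.h>

/* Local stand-in for the linear-PCM-relevant fields of CoreAudio's
 * AudioStreamBasicDescription, so the check compiles anywhere. */
typedef struct {
    double   mSampleRate;
    unsigned mBitsPerChannel;
    unsigned mChannelsPerFrame;
    unsigned mBytesPerFrame;
    unsigned mFramesPerPacket;
    unsigned mBytesPerPacket;
} PCMFormat;

/* For interleaved linear PCM every size field is derivable from bit depth
 * and channel count; any mismatch here makes playback speed wrong. */
static bool pcm_format_consistent(const PCMFormat *f) {
    return f->mBytesPerFrame == f->mChannelsPerFrame * (f->mBitsPerChannel / 8)
        && f->mBytesPerPacket == f->mBytesPerFrame * f->mFramesPerPacket
        && f->mFramesPerPacket == 1;   /* LPCM uses one frame per packet */
}
```

Also worth verifying that the mSampleRate handed to AudioQueueNewOutput matches the rate in the reader's output settings; a 2x mismatch there doubles playback speed the same way.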

Reading audio buffer data with AudioQueue

Submitted by 吃可爱长大的小学妹 on 2019-12-01 05:43:51
I am attempting to read audio data via AudioQueue. When I do so, I can verify that the bit depth of the file is 16-bit. But when I get the actual sample data, I'm only seeing values from -128 to 128. But I'm also seeing suspicious-looking interleaved data, which makes me pretty sure that I'm just not reading the data correctly. So to begin with, I can verify that the source file is a 44100 Hz, 16-bit, mono WAV file. My buffer is allocated thusly:

    char *buffer = NULL;
    buffer = malloc(BUFFER_SIZE);
    assert(buffer);

All the relevant values are set and used in AudioFileReadPackets(inAudioFile, false,
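The -128..128 values and "interleaved" look are what 16-bit samples look like when inspected one char at a time: each sample is two bytes, so a char view shows low bytes and high bytes alternating. The fix is to reinterpret pairs of bytes as 16-bit integers. A small sketch (the helper name is illustrative):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Read the index-th 16-bit sample out of a byte buffer filled by
 * AudioFileReadPackets. memcpy avoids alignment and strict-aliasing
 * pitfalls of casting char* to int16_t*; byte order is native, which
 * matches WAV's little-endian data on ARM and x86. */
static int16_t sample_at(const char *buffer, size_t index) {
    int16_t s;
    memcpy(&s, buffer + 2 * index, sizeof s);
    return s;
}
```

With BUFFER_SIZE bytes read, the buffer holds BUFFER_SIZE / 2 mono samples, each in the full -32768..32767 range.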

Audio recorded using Audio Queue Services to data

Submitted by ╄→尐↘猪︶ㄣ on 2019-11-30 14:42:44
I want to transmit voice from one iPhone to another. I have established a connection between two iPhones using TCP, and I have managed to record voice on the iPhone and play it using Audio Queue Services. I have also managed to send data between the two iPhones; I do this by sending NSData packages. My next step is to send the audio data to the other iPhone as it is being recorded. I believe I should do this in the AudioInputCallback. My AudioQueueBufferRef is called inBuffer, and it seems that I want to convert the inBuffer->mAudioData to NSData and then send the NSData to the other device and
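The key detail inside AudioInputCallback is that inBuffer->mAudioData points at inBuffer->mAudioDataByteSize valid bytes which the queue will reuse once the buffer is re-enqueued, so the bytes must be copied out first. In Objective-C that is [NSData dataWithBytes:inBuffer->mAudioData length:inBuffer->mAudioDataByteSize]; a pure-C stand-in for the same copy, with illustrative parameter names echoing the CoreAudio fields:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Snapshot a recorded buffer's bytes so the AudioQueueBuffer can be
 * re-enqueued immediately while the copy travels over the socket.
 * Caller frees the returned blob after the send completes. */
static unsigned char *copy_audio_bytes(const void *mAudioData,
                                       uint32_t mAudioDataByteSize) {
    unsigned char *blob = malloc(mAudioDataByteSize);
    if (blob)
        memcpy(blob, mAudioData, mAudioDataByteSize);
    return blob;
}
```

After the copy, re-enqueue the buffer with AudioQueueEnqueueBuffer so recording continues while the data is in flight.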

AudioQueueStart fail -12985

Submitted by 喜你入骨 on 2019-11-29 06:42:54
I made a streaming music player, and it works fine in the foreground. But in the background on iOS 4 it doesn't play the next song automatically (remote control works). The reason is that AudioQueueStart returns -12985. I already checked the audio session; it's fine. I use AudioQueueStart when it starts to play the music. How can you remove AudioQueueStart?

    - (void)play {
        [self setupAudioQueueBuffers];
        // calculate the size to use for each audio queue buffer, and calculate the
        // number of packets to read into each buffer
        OSStatus status = AudioQueueStart(self.queueObject, NULL);
    }

I read the answer
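Error -12985 from AudioQueueStart is commonly reported when the app tries to start playback in the background without an active playback-category audio session (and, on iOS 4, without the "audio" entry in UIBackgroundModes in Info.plist). An iOS 4-era configuration sketch using the C AudioSession API of that period (since deprecated in favor of AVAudioSession); shown as the setup to run before AudioQueueStart, not as a guaranteed fix:

```c
#include <AudioToolbox/AudioToolbox.h>

/* Configure the session for background playback before starting the queue.
 * Also requires UIBackgroundModes = ["audio"] in Info.plist. */
static void activate_playback_session(void) {
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof category, &category);
    AudioSessionSetActive(true);
}
```

If the session was deactivated by an interruption (a phone call, another app taking audio), it must be reactivated before the next AudioQueueStart; checking only that the category is set is not enough.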

Audio Processing: Playing with volume level

Submitted by ⅰ亾dé卋堺 on 2019-11-29 01:42:26
Question: I want to read a sound file from the application bundle, copy it, play it at its maximum volume level (gain value or peak power; I'm not sure about the technical name), and then write it as another file to the bundle again. I did the copying and writing part; the resulting file is identical to the input file. I use the AudioFileReadBytes() and AudioFileWriteBytes() functions of Audio File Services in the AudioToolbox framework to do that. So, I have the input file's bytes and also its audio data format (via
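Once AudioFileReadBytes has delivered raw 16-bit PCM, "playing with volume" is just multiplying each sample by a gain factor and clipping to the int16 range before AudioFileWriteBytes; a gain of 1.0 reproduces the input exactly, which is why a plain byte copy yields an identical file. A sketch with assumed names:

```c
#include <stddef.h>
#include <stdint.h>

/* Scale 16-bit PCM samples in place. Clipping (rather than letting the
 * cast wrap) keeps loud peaks distorted but not garbled. */
static void apply_gain(int16_t *samples, size_t count, float gain) {
    for (size_t i = 0; i < count; i++) {
        int32_t v = (int32_t)(samples[i] * gain);
        if (v > INT16_MAX) v = INT16_MAX;
        else if (v < INT16_MIN) v = INT16_MIN;
        samples[i] = (int16_t)v;
    }
}
```

To normalize to "maximum volume" without clipping, first scan the buffer for the largest absolute sample (the peak) and use gain = (float)INT16_MAX / peak.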

Objective-C - How to serialize an audio file into small packets that can be played?

Submitted by ℡╲_俬逩灬. on 2019-11-28 19:59:10
So, I would like to get a sound file, convert it into packets, and send it to another computer. I would like the other computer to be able to play the packets as they arrive. I am using AVAudioPlayer to try to play these packets, but I couldn't find a proper way to serialize the data on peer1 so that peer2 can play it. The scenario is: peer1 has an audio file, splits the audio file into many small packets, puts them in NSData, and sends them to peer2. Peer2 receives the packets and plays them one by one, as they arrive. Does anyone know how to do this? Or whether it is even possible? EDIT: Here it is
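A minimal framing scheme for the peer1 side, sketched in pure C: split the file's bytes into fixed-size chunks and prefix each with a 4-byte length so peer2 can reassemble them from the stream. The 1 KB payload size and names are illustrative choices, not a fixed protocol:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define PACKET_PAYLOAD 1024

/* Serialize the chunk starting at offset into out (which must hold at
 * least PACKET_PAYLOAD + 4 bytes). Returns the packet's total size,
 * or 0 when the file is exhausted. The length header is native-endian
 * here; use htonl for real cross-machine traffic. */
static size_t next_packet(const uint8_t *file, size_t file_len,
                          size_t offset, uint8_t *out) {
    if (offset >= file_len) return 0;
    size_t n = file_len - offset;
    if (n > PACKET_PAYLOAD) n = PACKET_PAYLOAD;
    uint32_t len = (uint32_t)n;
    memcpy(out, &len, 4);
    memcpy(out + 4, file + offset, n);
    return n + 4;
}
```

One caveat on the playback side: chunks of raw PCM are independently playable, but an MP3/AAC file split at arbitrary byte offsets is not - compressed audio has to be cut on real packet boundaries (AudioFileStream can parse those out of a byte stream), which is a large part of why feeding arbitrary NSData slices to AVAudioPlayer fails.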