This is my code: I use it to record the iPhone's audio output with an Audio Unit and then save the output to output.caf, but the output.caf file is always empty. Does anybody have an idea what's wrong?
The RemoteIO Audio Unit does not record the device's audio output, only input from the microphone; the output bus is not connected to the mic input.
If you use RemoteIO to play audio, you can copy the same buffers you feed to the output callback to a file writer as well. But that only works for raw audio content you play through RemoteIO yourself.
In initializaeOutputUnit you only create your audio file:
OSStatus setupErr = ExtAudioFileCreateWithURL(destinationURL, kAudioFileWAVEType, &audioFormat, NULL, kAudioFileFlags_EraseFile, &effectState.audioFileRef);
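Also make sure the audioFormat you pass here describes the same layout as the buffers you later write; a mismatch is another common cause of empty or unreadable files. For the mono SInt16 samples used in the callback below, a plausible AudioStreamBasicDescription looks like this (a sketch with assumed field values, not the asker's actual format):

```c
// Hypothetical audioFormat for mono 16-bit signed integer PCM,
// matching the SInt16 buffers filled in recordingCallback.
AudioStreamBasicDescription audioFormat = {0};
audioFormat.mSampleRate       = kSampleRate;   // e.g. 44100.0
audioFormat.mFormatID         = kAudioFormatLinearPCM;
audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                              | kAudioFormatFlagIsPacked;
audioFormat.mChannelsPerFrame = 1;
audioFormat.mBitsPerChannel   = 16;
audioFormat.mBytesPerFrame    = sizeof(SInt16); // 2 bytes: mono, 16-bit
audioFormat.mBytesPerPacket   = sizeof(SInt16); // PCM: 1 frame per packet
audioFormat.mFramesPerPacket  = 1;
```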
Passing 0 for the frame count and NULL for the buffer list just initializes the writer's internal buffers:
setupErr = ExtAudioFileWriteAsync(effectState.audioFileRef, 0, NULL);
That's what's going wrong in recordingCallback:
1) ioActionFlags is always 0 and inBusNumber is always 1, because that's how you set up your callback (kInputBus = 1):
if (*ioActionFlags == kAudioUnitRenderAction_PostRender&&inBusNumber==0)
so the condition is never true and the callback returns without writing anything; just remove the if statement.
2) AudioUnitRender returns error -50, which is defined in CoreAudioTypes.h as kAudio_ParamError. It happens because bufferList is undefined and NULL:
OSStatus status;
status = AudioUnitRender(THIS->mAudioUnit,
                         ioActionFlags,
                         inTimeStamp,
                         kInputBus,
                         inNumberFrames,
                         &bufferList);
if (noErr != status) {
    printf("AudioUnitRender error: %ld", status);
    return noErr;
}
You just need to define a valid AudioBufferList and pass it to AudioUnitRender; this is my working render callback:
static OSStatus recordingCallback(void                       *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp       *inTimeStamp,
                                  UInt32                      inBusNumber,
                                  UInt32                      inNumberFrames,
                                  AudioBufferList            *ioData)
{
    double timeInSeconds = inTimeStamp->mSampleTime / kSampleRate;
    printf("\n%fs inBusNumber: %lu inNumberFrames: %lu ", timeInSeconds, inBusNumber, inNumberFrames);
    //printAudioUnitRenderActionFlags(ioActionFlags);

    // Back the buffer list with a stack buffer large enough for this slice,
    // so we don't have to worry about buffer overrun.
    AudioBufferList bufferList;
    SInt16 samples[inNumberFrames];
    memset(&samples, 0, sizeof(samples));

    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0].mData = samples;
    bufferList.mBuffers[0].mNumberChannels = 1;
    bufferList.mBuffers[0].mDataByteSize = inNumberFrames * sizeof(SInt16);

    ViewController *THIS = (__bridge ViewController *)inRefCon;

    OSStatus status = AudioUnitRender(THIS->mAudioUnit,
                                      ioActionFlags,
                                      inTimeStamp,
                                      kInputBus,
                                      inNumberFrames,
                                      &bufferList);
    if (noErr != status) {
        printf("AudioUnitRender error: %ld", status);
        return noErr;
    }

    // Now the samples we just read are sitting in bufferList; append them to the file.
    ExtAudioFileWriteAsync(THIS->mAudioFileRef, inNumberFrames, &bufferList);

    return noErr;
}
In stopRecording you should close the audio file with ExtAudioFileDispose:
- (void)stopRecording:(NSTimer *)theTimer
{
    printf("\nstopRecording\n");
    AudioOutputUnitStop(mAudioUnit);
    AudioUnitUninitialize(mAudioUnit);

    OSStatus status = ExtAudioFileDispose(mAudioFileRef);
    printf("OSStatus(ExtAudioFileDispose): %ld\n", status);
}
Full source code: http://pastebin.com/92Fyjaye