core-audio

Converting an AudioBufferList to a CMSampleBuffer Produces Unexpected Results

ぐ巨炮叔叔 submitted on 2019-12-31 00:44:25
Question: I'm trying to convert an AudioBufferList that I get from an Audio Unit into a CMSampleBuffer that I can pass into an AVAssetWriter to save audio from the microphone. The conversion works, in that the calls I'm making to perform the transformation don't fail, but recording ultimately does fail, and I'm seeing some output in the logs that seems to be cause for concern. The code I'm using looks like this: - (void)handleAudioSamples:(AudioBufferList*)samples numSamples:(UInt32)numSamples
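A minimal sketch of one way to do this wrapping, assuming `asbd` describes the incoming samples and the caller keeps a running presentation time; the function name and error handling are illustrative, not taken from the question:

#import <CoreMedia/CoreMedia.h>

static CMSampleBufferRef CreateSampleBufferFromAudioBufferList(const AudioBufferList *bufferList,
                                                               UInt32 numSamples,
                                                               const AudioStreamBasicDescription *asbd,
                                                               CMTime presentationTime) {
    CMAudioFormatDescriptionRef format = NULL;
    OSStatus status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, asbd,
                                                     0, NULL, 0, NULL, NULL, &format);
    if (status != noErr) return NULL;

    CMSampleTimingInfo timing = {
        .duration = CMTimeMake(1, (int32_t)asbd->mSampleRate),   // one frame per sample
        .presentationTimeStamp = presentationTime,
        .decodeTimeStamp = kCMTimeInvalid
    };

    CMSampleBufferRef sampleBuffer = NULL;
    status = CMSampleBufferCreate(kCFAllocatorDefault, NULL, false, NULL, NULL,
                                  format, numSamples, 1, &timing, 0, NULL, &sampleBuffer);
    CFRelease(format);
    if (status != noErr) return NULL;

    // Copy the AudioBufferList contents into the sample buffer's backing block buffer.
    status = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                            kCFAllocatorDefault,
                                                            kCFAllocatorDefault,
                                                            0, bufferList);
    if (status != noErr) {
        CFRelease(sampleBuffer);
        return NULL;
    }
    // The returned buffer can then be appended to the AVAssetWriterInput and CFRelease'd.
    return sampleBuffer;
}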

Intermittent crash in recordingCallback() on app launch

丶灬走出姿态 submitted on 2019-12-30 16:20:14
Question: My iOS app (using openFrameworks) crashes 30-40% of the time on launch on this line: if(soundInputPtr!=NULL) soundInputPtr->audioIn(tempBuffer, ioData->mBuffers[i].mDataByteSize/2, 1); which is inside the larger function in ofxiPhoneSoundStream.m: static OSStatus recordingCallback(void *inRefCon, AudioUnitRenderActionFlags *ioActionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData) { I am doing audio setup with ofSoundStreamSetup(0, 1,
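If the crash is a race where the render callback fires before the openFrameworks side has finished initializing (an assumption, not a confirmed diagnosis), one defensive variant of the loop body is to skip buffers that are not yet valid; the names mirror the ofxiPhoneSoundStream code above, and the SInt16-to-float conversion into tempBuffer is elided:

for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
    AudioBuffer *buf = &ioData->mBuffers[i];
    if (buf->mData == NULL || buf->mDataByteSize == 0) {
        continue;   // the unit handed us nothing for this buffer yet; skip it
    }
    if (soundInputPtr != NULL && tempBuffer != NULL) {
        // mDataByteSize/2 converts bytes of SInt16 samples into a sample count, as above
        soundInputPtr->audioIn(tempBuffer, buf->mDataByteSize / 2, 1);
    }
}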

Could NaN be causing the occasional crash in this Core Audio iOS app?

孤人 submitted on 2019-12-29 09:08:32
Question: My first app synthesised music audio from a sine look-up table, using methods deprecated since iOS 6. I have just revised it to address the AudioSession warnings, helped by this blog and the Apple guidelines on the AVFoundation framework. The Audio Session warnings have now been addressed and the app produces audio as it did before. It currently runs under iOS 9. However, the app occasionally crashes for no apparent reason. I checked out this SO post, but it seems to deal with accessing rather than
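For the NaN hypothesis in the title, a minimal guard in the render path (a sketch, not the poster's code) is to sanitize each sample before it is written to the output buffer, so a stray NaN or infinity becomes silence instead of undefined behaviour downstream:

#include <math.h>

// Clamp/sanitize one output sample; `float` here stands in for whatever sample type
// the render callback actually writes (an assumption about the poster's setup).
static inline float SanitizeSample(float x) {
    if (!isfinite(x)) return 0.0f;   // drop NaN / infinity rather than propagate it
    if (x > 1.0f)  return 1.0f;      // keep the sample inside the valid [-1, 1] range
    if (x < -1.0f) return -1.0f;
    return x;
}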

How to make a simple EQ AudioUnit (bass, mid, treble) with iOS?

荒凉一梦 submitted on 2019-12-29 03:12:47
Question: Does anyone know how to make a simple EQ audio unit (3 bands: low, mid, high) with iOS? I know how to add an iPod EQ Audio Unit to my AU Graph, but it only gives you access to presets, and I need proper control of the EQ. I've looked around for some tutorials or explanations but had no luck. Thanks. André Answer 1: The iPhone doesn't exactly support custom AudioUnits. Or, more precisely, it doesn't allow you to register an AudioUnit's identifier so you could load it in an AUGraph. You can, however,
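As a later alternative (assuming iOS 5 or newer, which postdates this answer), Apple's kAudioUnitSubType_NBandEQ effect exposes per-band frequency, gain, and filter-type parameters, so a bass/mid/treble EQ can be assembled without registering a custom AudioUnit. A sketch, where `eqUnit` is assumed to be the AudioUnit fetched from the corresponding AUNode in your AUGraph:

#import <AudioToolbox/AudioToolbox.h>

static void ConfigureThreeBandEQ(AudioUnit eqUnit) {
    // The band count must be set before the unit is initialized.
    UInt32 numBands = 3;
    AudioUnitSetProperty(eqUnit, kAUNBandEQProperty_NumberOfBands,
                         kAudioUnitScope_Global, 0, &numBands, sizeof(numBands));

    Float32 freqs[3] = { 100.0f, 1000.0f, 8000.0f };          // bass, mid, treble centres
    AudioUnitParameterValue types[3] = { kAUNBandEQFilterType_LowShelf,
                                         kAUNBandEQFilterType_Parametric,
                                         kAUNBandEQFilterType_HighShelf };

    for (UInt32 band = 0; band < numBands; band++) {
        AudioUnitSetParameter(eqUnit, kAUNBandEQParam_FilterType + band,
                              kAudioUnitScope_Global, 0, types[band], 0);
        AudioUnitSetParameter(eqUnit, kAUNBandEQParam_Frequency + band,
                              kAudioUnitScope_Global, 0, freqs[band], 0);
        AudioUnitSetParameter(eqUnit, kAUNBandEQParam_Gain + band,      // gain in dB
                              kAudioUnitScope_Global, 0, 0.0f, 0);
        AudioUnitSetParameter(eqUnit, kAUNBandEQParam_BypassBand + band,
                              kAudioUnitScope_Global, 0, 0, 0);         // 0 = band active
    }
}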

Tone Generation in Cocoa Touch

断了今生、忘了曾经 submitted on 2019-12-29 03:11:26
Question: I need to generate a tone whose frequency and waveform I can manipulate. The overall goal is to create a basic piano. Does anyone know how I can achieve this? My development platform is the iPhone 2.x. Answer 1: You could always start with sine waves. :-) #include <cmath> typedef double Sample; typedef double Time; class MonoNote { protected: Time start, duration; virtual void internalRender(double now, Sample *mono) = 0; public: MonoNote(Time s, Time d) : start(s), duration(d) {} virtual ~MonoNote() {}
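The piece the excerpt trails off before is the actual sample generation. A minimal sine fill that an internalRender() implementation could call (names illustrative, assuming double samples at a fixed sample rate):

#include <math.h>

// Add numFrames of a sine at `frequency` Hz into `mono`, advancing `phase` so
// consecutive calls stay continuous. The per-sample phase increment is 2*pi*f/sampleRate.
static void FillSine(double *mono, int numFrames, double frequency,
                     double sampleRate, double *phase) {
    const double increment = 2.0 * M_PI * frequency / sampleRate;
    for (int i = 0; i < numFrames; i++) {
        mono[i] += 0.25 * sin(*phase);        // mix at reduced gain so chords don't clip
        *phase += increment;
        if (*phase > 2.0 * M_PI) *phase -= 2.0 * M_PI;
    }
}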

How to find out if an external headset is connected to an iPhone?

我的梦境 submitted on 2019-12-28 18:50:27
Question: Is it possible to detect that the user has an external headset plugged into the iPhone's 3.5mm connector or the 30-pin connector? I want to output audio only to an external audio device, and keep silent if nothing is connected. Answer 1: The answer is very similar to the answer to this question, but you'll want to get the kAudioSessionProperty_AudioRoute property instead. Answer 2: Call this method to find out whether a Bluetooth headset is connected or not. First import this framework: #import <AVFoundation
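A sketch of the modern equivalent of the kAudioSessionProperty_AudioRoute check (AVAudioSession has since replaced the C AudioSession API): inspect the current route's outputs for a headphone port. This is illustrative, not the answer's original code:

#import <AVFoundation/AVFoundation.h>

static BOOL HeadphonesAreConnected(void) {
    AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
    for (AVAudioSessionPortDescription *output in route.outputs) {
        // Wired headphones; other port types (Bluetooth A2DP, USB audio, etc.)
        // can be checked the same way if they should also count as "external".
        if ([output.portType isEqualToString:AVAudioSessionPortHeadphones]) {
            return YES;
        }
    }
    return NO;
}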

Recording from RemoteIO: resulting .caf is pitch shifted slower + distorted

倾然丶 夕夏残阳落幕 submitted on 2019-12-28 18:39:22
Question: So I've cobbled together some routines for recording audio based on some posts here. The posts I've referenced are here and here, along with reading the sites they reference. My setup: I have an existing AUGraph (several AUSamplers -> Mixer -> RemoteIO). The AUSamplers are connected to tracks in a MusicPlayer instance. That all works fine, but I want to add recording to it. Recording is working, but the resulting .caf is pitch/tempo shifted slower and has bad sound quality. It must be something
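A pitch/tempo shift plus distortion in the written .caf is the classic symptom of a sample-rate or interleaving mismatch between what the render path delivers and what ExtAudioFile was told to expect. A hedged sketch of the alignment step, assuming `remoteIOUnit` and `extAudioFile` already exist in the poster's setup:

#import <AudioToolbox/AudioToolbox.h>

// Align the file's client format with what actually reaches RemoteIO's input bus 0
// (the mix being played), so ExtAudioFile converts correctly on write.
static void MatchClientFormatToGraph(AudioUnit remoteIOUnit, ExtAudioFileRef extAudioFile) {
    AudioStreamBasicDescription clientFormat = {0};
    UInt32 size = sizeof(clientFormat);
    AudioUnitGetProperty(remoteIOUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &clientFormat, &size);

    // The on-disk format chosen at ExtAudioFileCreateWithURL time can differ;
    // the client format must describe the render-side buffers exactly.
    ExtAudioFileSetProperty(extAudioFile, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(clientFormat), &clientFormat);
}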

How to record sound produced by mixer unit output (iOS Core Audio & Audio Graph)

☆樱花仙子☆ submitted on 2019-12-28 08:12:47
Question: I'm trying to record the sound produced by a mixer unit's output. For the moment, my code is based on the Apple MixerHost iOS demo app: a mixer node is connected to a remote IO node on the audio graph, and I try to set an input callback on the remote IO node's input, fed by the mixer output. I'm doing something wrong but I cannot find the error. Here is the code below; this is done just after the Multichannel Mixer unit setup: UInt32 flag = 1; // Enable IO for playback result = AudioUnitSetProperty(iOUnit,
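One way to tap the mixer's output without rewiring the graph (a sketch under the MixerHost-style setup described above, not the demo's code) is to register a render-notify callback on the mixer unit and write ioData out on the post-render pass; `mixerUnit` and `extAudioFile` are assumed to be set up elsewhere:

#import <AudioToolbox/AudioToolbox.h>

static OSStatus MixerTapCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData) {
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        // ioData now holds the mixed output about to reach RemoteIO;
        // ExtAudioFileWriteAsync is usable from a render thread once primed.
        ExtAudioFileWriteAsync((ExtAudioFileRef)inRefCon, inNumberFrames, ioData);
    }
    return noErr;
}

// Registration, after the graph is built and the capture file is open:
// AudioUnitAddRenderNotify(mixerUnit, MixerTapCallback, extAudioFile);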

How to write output of AUGraph to a file?

落花浮王杯 submitted on 2019-12-28 05:56:13
Question: I am trying to write (what should be) a simple app that has a bunch of audio units in sequence in an AUGraph and then writes the output to a file. I added a callback using AUGraphAddRenderNotify. Here is my callback function: OSStatus MyAURenderCallback(void *inRefCon, AudioUnitRenderActionFlags *actionFlags, const AudioTimeStamp *inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList *ioData) { if (*actionFlags & kAudioUnitRenderAction_PostRender) { ExtAudioFileRef
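A sketch of the file setup that pairs with a PostRender notify like this one, assuming a linear-PCM CAF and a `clientFormat` matching the graph's stream format; note the priming call, since Apple's ExtendedAudioFile header recommends one ExtAudioFileWriteAsync(file, 0, NULL) from a non-real-time thread before the first write from inside a render callback. Names and the CAF file type are illustrative choices, not the poster's code:

#import <AudioToolbox/AudioToolbox.h>

static ExtAudioFileRef CreateCaptureFile(CFURLRef outputURL,
                                         const AudioStreamBasicDescription *clientFormat) {
    ExtAudioFileRef file = NULL;
    OSStatus status = ExtAudioFileCreateWithURL(outputURL, kAudioFileCAFType, clientFormat,
                                                NULL, kAudioFileFlags_EraseFile, &file);
    if (status != noErr) return NULL;

    // Tell ExtAudioFile what the render callback will hand it.
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(*clientFormat), clientFormat);

    // Prime the async write machinery from this (non-real-time) thread so the first
    // call from inside the render callback does no allocation.
    ExtAudioFileWriteAsync(file, 0, NULL);
    return file;
}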