Question
My music performance app plays audio with AVAudioEngine, and uses Inter-App Audio (IAA) to publish the engine's output to other apps. This allows users to feed the audio into a mixer app running on the same device. Since IAA is deprecated on iOS and not supported on Mac, I'm trying to replace this functionality with Audio Units.
I've added an audio unit extension of type augn using the Xcode template, and I understand the internalRenderBlock is what actually returns the audio data. But how can the extension access the audio playing in the container (main) app?
Is this even possible? I would expect this to be a common use case since Audio Units are positioned as a replacement for IAA, but I haven't seen any examples of anyone doing something like this. I don't want to process input from the host app, and I don't want to generate sound from scratch; I need to tap into the sound that the containing app is playing.
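For reference, here is roughly the shape of the render block the template gives me (simplified into Swift for illustration; the initializer and bus setup are elided, and the class name is mine). The open question is what to copy into outputData:

```swift
import AVFoundation
import AudioToolbox

class MyGeneratorAudioUnit: AUAudioUnit {
    // The host calls this block on its real-time thread and expects
    // `outputData` to be filled with `frameCount` frames of audio.
    override var internalRenderBlock: AUInternalRenderBlock {
        return { actionFlags, timestamp, frameCount, outputBusNumber,
                 outputData, realtimeEventListHead, pullInputBlock in
            let buffers = UnsafeMutableAudioBufferListPointer(outputData)
            for buffer in buffers {
                // Placeholder: silence. What I actually want here is the
                // audio that my containing app's AVAudioEngine is playing.
                if let data = buffer.mData {
                    memset(data, 0, Int(buffer.mDataByteSize))
                }
            }
            return noErr
        }
    }
}
```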
UPDATE
I just read the section "How an App Extension Communicates" in the App Extension Programming Guide. It doesn't look promising:
An app extension communicates directly only with the host app. There is no direct communication between an app extension and its containing app; typically, the containing app isn’t even running while a contained extension is running.
Also:
A Today widget (and no other app extension type) can ask the system to open its containing app by calling the openURL:completionHandler: method of the NSExtensionContext class. Any app extension and its containing app can access shared data in a privately defined shared container.
If that's the extent of the data sharing between the container and the extension, I don't see how this could work. The extension would need to access an AVAudioEngine node in real time, so that if the user of the containing app changes sounds, plays, pauses, changes volume, etc., it would all be reflected in the output that the host app receives.
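(For what it's worth, capturing the engine's output inside my own app is straightforward with a tap; the part I can't see is how to hand those buffers to the extension in real time. A minimal sketch of the containing-app side:)

```swift
import AVFoundation

let engine = AVAudioEngine()
// ... players and effects attached and connected to engine.mainMixerNode ...

// This tap sees exactly what the user hears, including pauses,
// volume changes, sound changes, etc.
let mixer = engine.mainMixerNode
let format = mixer.outputFormat(forBus: 0)
mixer.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
    // `buffer` is an AVAudioPCMBuffer of the live output.
    // The missing piece: delivering it to the extension's render block.
}
```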
And yet, if AUv3 doesn't have this capability, taking away IAA leaves a big gap in the platform. Hopefully there's another approach I'm not thinking of.
Maybe this needs to work the other way around: in my situation, the mixer app would offer the audio unit extension, and my app (an audio player) would be the host and provide the audio to the mixer's extension. But then the mixer app would have the same problem: it couldn't obtain the incoming audio from its own extension.
Answer 1:
In addition to playing the audio via AVAudioEngine, an app also has to publish its audio output in an Audio Unit extension. That app extension's output can potentially be made visible to the input of other apps or to Audio Unit extensions contained in other apps.
Added: To send audio data from an app to its own app extension, you can try putting the app and its extension in the same App Group, creating a set of shared files, and perhaps memory mapping the shared file(s). Or use writeToFile:atomically: to put blocks of audio samples into a ring buffer of shared files.
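Roughly, the shared-container part might look like the sketch below (the group identifier is a placeholder, and the ring-buffer bookkeeping and real-time-safe synchronization are not shown):

```swift
import Foundation

// Placeholder identifier; both the app and its extension must list the
// same group in their App Groups entitlement.
let groupID = "group.com.example.shared-audio"

// Open (or create) a fixed-size file in the shared container and map it,
// so both processes see the same bytes without per-block file I/O.
func openSharedAudioFile(byteCount: Int) throws -> Data {
    guard let container = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: groupID) else {
        throw CocoaError(.fileNoSuchFile)
    }
    let url = container.appendingPathComponent("audio-ring.dat")
    if !FileManager.default.fileExists(atPath: url.path) {
        _ = FileManager.default.createFile(atPath: url.path,
                                           contents: Data(count: byteCount))
    }
    // .alwaysMapped gives a memory-mapped, read-only view; writing samples
    // would go through FileHandle or mmap directly.
    return try Data(contentsOf: url, options: .alwaysMapped)
}
```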
Also, the original pre-IAA method in iOS was to use MIDI SysEx data packets to pass audio sample blocks between apps. This might be possible on macOS as well, with a fairly low latency.
Answer 2:
I contacted Apple Developer Tech Support, and they recommended continuing to use IAA on iOS.
They also mentioned exchanging data between the container and the extension with files in an app group, which I assumed would be unsuitable for real-time audio, but hotpaw2's answer provides a couple of hints about making that work.
I did find a couple of third-party alternatives for Mac:
Loopback - costs users $100, but I tested it with the free trial and it worked
BlackHole - free to users, open source, and could potentially be licensed for integration into other apps; I didn't try it
Source: https://stackoverflow.com/questions/61360345/can-an-audio-unit-v3-replace-inter-app-audio-to-send-audio-to-a-host-app