My music performance app plays audio with AVAudioEngine, and uses inter-app audio to publish the engine's output to other apps. This allows users to feed the audio into a mixer app running on the same device. Since IAA is deprecated on iOS and not supported on Mac, I'm trying to replace this functionality with Audio Units.
I've added an Audio Unit extension of type augn (generator) using the Xcode template, and I understand that internalRenderBlock is what actually returns the audio data. But how can the extension access the audio playing in the containing (main) app?
Is this even possible? I would expect this to be a common use case since Audio Units are positioned as a replacement for IAA, but I haven't seen any examples of anyone doing something like this. I don't want to process input from the host app, and I don't want to generate sound from scratch; I need to tap into the sound that the containing app is playing.
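For context, inside the app itself the engine's output can already be captured with a tap; the sticking point is getting those samples out to the extension's process. A minimal sketch of the in-app side (node setup omitted, names illustrative):

```swift
import AVFoundation

let engine = AVAudioEngine()
// ... attach and connect player/effect nodes as usual ...

// Install a tap on the main mixer to observe the engine's output.
// This works inside the app, but an AUv3 extension runs in a separate
// process and cannot reach this engine instance directly.
let format = engine.mainMixerNode.outputFormat(forBus: 0)
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
    // buffer.floatChannelData points at the freshly rendered samples;
    // this is where they would have to be handed off to another process.
}
try? engine.start()
```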
UPDATE
I just read the section "How an App Extension Communicates" in the App Extension Programming Guide. It doesn't look promising:
An app extension communicates directly only with the host app. There is no direct communication between an app extension and its containing app; typically, the containing app isn’t even running while a contained extension is running.
Also:
A Today widget (and no other app extension type) can ask the system to open its containing app by calling the openURL:completionHandler: method of the NSExtensionContext class. Any app extension and its containing app can access shared data in a privately defined shared container.
If that's the extent of the data sharing between the container and the extension, I don't see how this could work. The extension would need to access an AVAudioEngine node in real time, so that if the user of the containing app changes sounds, plays, pauses, changes volume, etc., it would all be reflected in the output that the host app receives.
And yet I feel like taking away IAA if AUv3 doesn't have this capability leaves a big gap in the platform. Hopefully there's another approach I'm not thinking of.
Maybe this would need to work the other way around, so in my situation, the mixer app would offer the audio unit extension, and then my app (an audio player) would be the host and provide the audio to the mixer's extension. But then the mixer app would have the same problem of not being able to obtain the incoming audio from its extension.
In addition to playing the audio via AVAudioEngine, an app would also have to publish its audio output in an Audio Unit extension. That extension's output could then potentially be made visible to the input of other apps, or to Audio Unit extensions contained in other apps.
Added: To send audio data from an app to its own app extension, you can try putting the app and its extension in the same App Group, creating a set of shared files, and perhaps memory mapping the shared file(s). Or use writeToFile:atomically: to put blocks of audio samples into a ring buffer of shared files.
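A rough sketch of that shared-container idea (the App Group identifier and file names are placeholders, and plain file I/O is not guaranteed to be real-time safe; a real implementation would cycle through a ring of slot files):

```swift
import Foundation

// Both the app and its extension must be in the same App Group.
let groupID = "group.com.example.myapp" // placeholder identifier
guard let container = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: groupID) else {
    fatalError("App Group not configured")
}

// Writer side (the containing app): write one block of samples
// into a slot of the shared ring; the extension polls and reads it.
let slotIndex = 0 // cycle 0..<slotCount in a real ring buffer
let slotURL = container.appendingPathComponent("audio-slot-\(slotIndex).pcm")
let samples: [Float] = [] // freshly rendered audio, e.g. from a tap
let data = samples.withUnsafeBufferPointer { Data(buffer: $0) }
try? data.write(to: slotURL, options: .atomic)

// Reader side (the extension): recover the Float samples.
if let incoming = try? Data(contentsOf: slotURL) {
    let restored: [Float] = incoming.withUnsafeBytes { raw in
        Array(raw.bindMemory(to: Float.self))
    }
    _ = restored
}
```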
Also, the original pre-IAA method in iOS was to use MIDI SysEx data packets to pass audio sample blocks between apps. This might be possible on macOS as well, with a fairly low latency.
I contacted Apple Developer Tech Support, and they recommended continuing to use IAA on iOS.
They also mentioned exchanging data between the container and extension with files in an app group, which I assumed would be unsuitable for real-time audio, but hotpaw2's answer provides a couple hints about making that work.
I did find a couple third-party alternatives for Mac:
Loopback - costs users $100, but I tested it with the free trial and it worked
Blackhole - free to users, open-source and could potentially be licensed for integration into other apps; I didn't try it
I read similar questions like this one, but my question is different because I don't want to record sound in my app. I have an app that should access sound already recorded by other apps (Voice Memos or other recording apps). I know iOS places some limits on this, but I think it is possible. Please help me.
Why don't you try AVAudioPlayer from the AVFoundation framework?
You could also try some third-party framework like SwiftySound
https://github.com/adamcichy/SwiftySound
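A minimal AVAudioPlayer sketch (the file name is a placeholder for a sound bundled with your app):

```swift
import AVFoundation

var player: AVAudioPlayer? // keep a strong reference, or playback stops

func playBundledSound() {
    // "sound.mp3" is a placeholder resource name
    guard let url = Bundle.main.url(forResource: "sound", withExtension: "mp3") else { return }
    do {
        player = try AVAudioPlayer(contentsOf: url)
        player?.prepareToPlay()
        player?.play()
    } catch {
        print("Playback failed: \(error)")
    }
}
```

Note that AVAudioPlayer only plays files your app can already read; it does not by itself grant access to recordings made by other apps.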
Just to mention: there are also plenty of system sounds, which are sometimes quite useful.
You can see the list at the following link
https://github.com/TUNER88/iOSSystemSoundsLibrary
You can play them by linking the AudioToolbox framework in your project.
Add this import to a file (if you are using Objective-C):
#import <AudioToolbox/AudioToolbox.h>
Then run AudioServicesPlaySystemSound(PUT_SOUND_ID_HERE); somewhere.
I'm trying to play a video on the apple watch using WKInterfaceMovie. I want it to be a video on a remote server.
In a similar question a solution is given and I was able to reproduce this for local files. It's also said in the solution, that it only works for local files and I'm wondering why that is.
Are there any sources in the documentation or is it the general experience?
I was beginning to wonder, because it is said in the transition guide under The Movie Object (WKInterfaceMovie) that:
"The URL you specify for your media assets may refer to a local file or an asset located on a remote server. For remote assets, the movie object downloads the movie completely before playing it."
So I think it should be possible according to that documentation, even though I haven't been able to implement it successfully yet.
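For reference, what I'm attempting looks roughly like this (the remote URL is a placeholder; the same code works when I point it at a local file URL instead):

```swift
import WatchKit

class MovieController: WKInterfaceController {
    @IBOutlet var movie: WKInterfaceMovie!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // Works with a local file URL; per the transition guide, a remote
        // URL should also work once fully downloaded. (Placeholder URL.)
        if let url = URL(string: "https://example.com/video.mp4") {
            movie.setMovieURL(url)
        }
    }
}
```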
I'm making an iOS app in which I'd like to allow the user to save an audio file (a specific file that the app uses internally, not just any arbitrary audio file) to their music library so they can play it from other apps on the device. Ideally I'd like to save a sound directly to the user's music library, but it seems from other similar questions that this is not possible. File sharing with iTunes seems to be the next best solution.
Is there anything about using the iTunes file sharing option for saving audio in this way that violates the app store terms?
Is this the path of least friction for the user, or is there another way to achieve this that I'm missing?
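For what it's worth, iTunes file sharing is enabled with a single Info.plist key; files the app writes to its Documents directory then show up in iTunes (or Finder on newer macOS):

```xml
<!-- Info.plist: expose the app's Documents directory via file sharing -->
<key>UIFileSharingEnabled</key>
<true/>
```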
Is there a way to invoke a media player on BlackBerry?
If so, can we also pass a URL to the player, asking it to open and stream a remote file?
Are you talking about invoking the standard media application, or just embedding a player in your app?
1) If you want to embed a player, check this: http://supportforums.blackberry.com/t5/Java-Development/Playing-Audio-in-Your-Application/ta-p/446826
2) If you need to start the standard BlackBerry player, you can use the Content Handler API (CHAPI). Look around and you will find many tutorials. Basically it works like this: you have a .mp3 file, for example, and you make a request to the BlackBerry OS asking "can anyone handle the .mp3 extension?" If there is an app registered to handle mp3, you can invoke it, and it handles the foreground/background transitions between your app and the mp3-handling app.