Independent volume control of AVAudioPlayer and MPMusicPlayerController in an iOS app

Within my application, I am playing downloaded audio using an AVAudioPlayer, while simultaneously playing audio from the user's iPod music library with an MPMusicPlayerController.
I need to be able to adjust the volume of the AVAudioPlayer instance so that it is louder
than the audio coming from the MPMusicPlayerController.
The problem is that when I adjust the volume property of the AVAudioPlayer, it also
adjusts the volume of the MPMusicPlayerController.
Is there any solution which would allow me to independently control the volume of
these two players?
If not, is there another technique I should use to do this? Any help is appreciated.

Take a look at the documentation for AVAudioSession. For example, the AVAudioSession Programming Guide says the following:
"Finally, you can enhance a category to automatically lower the volume of other audio when your audio is playing. This could be used, for example, in an exercise application. Say the user is exercising along to their iPod when your application wants to overlay a verbal message—for instance, “You’ve been rowing for 10 minutes.” To ensure that the message from your application is intelligible, apply the kAudioSessionProperty_OtherMixableAudioShouldDuck property to your audio session. When ducking takes place, all other audio on the device—apart from phone audio—lowers in volume."
I think it might solve your problem. The documentation on initializing an AVAudioSession and setting its categories and properties is pretty clear and easy to follow; you should have no trouble.
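The property quoted above belongs to the older C-based Audio Session Services API; if you are working with AVAudioSession directly, a rough modern equivalent is the .duckOthers category option. A minimal sketch (not from the original answer; error handling is simplified):

import AVFoundation

// Sketch: configure the session so other audio (e.g. the iPod/Music app)
// is ducked while this app's sounds play. The .duckOthers option is the
// AVAudioSession counterpart of the C-level property quoted above and
// implies mixing with other audio.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playback, options: [.duckOthers])
    try session.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}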

Related

Best way to play silence using AVAudioPlayer on iOS

I found myself in a situation where I need to simulate audio playback to trick the OS controls and MPNowPlayingInfoCenter into thinking that audio is being played. This is because I am building a player that plays multiple audio tracks, with pauses in between, creating one continuous "audio" track. I already have everything set up inside the app itself, and the lock screen controls are working correctly. The only problem I am facing is that while the actual audio is stopped and a pause is being "played", the lock screen info center stops the timer, and it only resumes showing the correct time and overall state once another audio track starts playing.
Here is the example of my audio track built from audio files and pause items:
let items: [AudioItem] = [
    .audio("part-1.mp3"),
    .pause(duration: 5), // value of type TimeInterval
    .audio("part-2.mp3"),
    .pause(duration: 3),
    ... // the list goes on
]
Then, in my custom player, once AVAudioPlayer finishes its job with the current item, I get the next one from the array and play either a .pause with a scheduled Timer or another .audio with AVAudioPlayer.
extension Player: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playNextItem()
    }
}
And here lies the problem: once the AVAudioPlayer stops, the Now Playing info center automatically stops too, even though I keep feeding it fresh nowPlayingInfo. Then, when it hits another .audio item, it resumes correctly and shows the current time, etc.
And here lies the question:
How do I trick the MPNowPlayingInfoCenter into thinking that audio is being played while I "play" my .pause item?
I realise that it may still not be clear what I am trying to achieve, but I am happy to share more insight if needed. Thanks!
Some solutions I am currently thinking about:
A. Keeping a 1-second-long empty audio track that would play on loop for as long as the pause needs to play.
B. Programmatically creating an empty audio track of the appropriate length and playing it instead of using a Timer to track the pause duration/progress, relying completely on AVAudioPlayer for both .audio and .pause items. Not sure this is possible, though.
C. Maybe there is a way to tell the MPNowPlayingInfoCenter that the audio keeps playing without using AVAudioPlayer, via some API I am not familiar with?
AVAudioPlayer is probably the wrong tool here. You want AVAudioPlayerNode, which is slightly lower-level. Create an AVAudioEngine, and attach an AVAudioPlayerNode. You can then call scheduleFile(_:at:completionHandler:) to play the audio at the times you want.
Much of the Apple documentation on AVAudioEngine appears to be broken at the moment, but the links under Audio Engine Building Blocks should hopefully be available again shortly. (If it stays down and you have trouble finding docs, leave a comment and I'll hunt down the WWDC videos and other tutorials on using AVAudioEngine. It's not particularly difficult for simple problems.)
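As a rough illustration of the idea (a sketch only; the file names, the 5-second gap, and the assumption that both files share one sample rate are mine, not part of the original answer):

import AVFoundation

// Sketch of the AVAudioEngine approach: one AVAudioPlayerNode keeps
// "playing" across the whole sequence, and gaps are created by
// scheduling each file at an explicit sample-time offset.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

func playSequenceWithGap() throws {
    // Assumes the files are bundled with the app.
    let part1 = try AVAudioFile(forReading: Bundle.main.url(forResource: "part-1", withExtension: "mp3")!)
    let part2 = try AVAudioFile(forReading: Bundle.main.url(forResource: "part-2", withExtension: "mp3")!)

    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: part1.processingFormat)
    try engine.start()

    let sampleRate = part1.processingFormat.sampleRate

    // Player time starts at 0 when play() is called, so scheduling the
    // second file at an explicit offset leaves a silent gap in between.
    playerNode.scheduleFile(part1, at: nil, completionHandler: nil)

    let gap: TimeInterval = 5
    let offset = AVAudioFramePosition(Double(part1.length) + gap * sampleRate)
    playerNode.scheduleFile(part2,
                            at: AVAudioTime(sampleTime: offset, atRate: sampleRate),
                            completionHandler: nil)

    playerNode.play()
}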
If you know in advance how you want to compose these items (and it looks like you may), see also AVMutableComposition, which lets you glue together assets very efficiently, including adding empty segments of silence. See Media Composition and Editing for the various tools in that space.
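For the composition route, a minimal sketch might look like this (file names and pause lengths are placeholders; error handling and the remaining items are elided):

import AVFoundation

// Sketch: splice audio files and explicit silent gaps into one asset,
// then play the whole thing as a single player item.
func makeComposition() throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let track = composition.addMutableTrack(withMediaType: .audio,
                                            preferredTrackID: kCMPersistentTrackID_Invalid)!

    let part1 = AVURLAsset(url: Bundle.main.url(forResource: "part-1", withExtension: "mp3")!)
    let sourceTrack = part1.tracks(withMediaType: .audio)[0]

    var cursor = CMTime.zero
    try track.insertTimeRange(CMTimeRange(start: .zero, duration: part1.duration),
                              of: sourceTrack, at: cursor)
    cursor = CMTimeAdd(cursor, part1.duration)

    // A 5-second stretch of silence between parts.
    let pause = CMTime(seconds: 5, preferredTimescale: 600)
    track.insertEmptyTimeRange(CMTimeRange(start: cursor, duration: pause))
    cursor = CMTimeAdd(cursor, pause)

    // ...repeat for part-2.mp3 and the remaining items, then play with:
    // AVPlayer(playerItem: AVPlayerItem(asset: composition)).play()
    return composition
}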

How to implement an audio player interface?

I'm using AVAudioPlayer to play mp3 files, but I need to implement a UI like the one shown below:
I think it may be an iOS system control I can use, but I can't find which control it is.
That is an MPMoviePlayerController. Compare, for example this illustration from my book:
The same section of my book tells you how to make and work with one of these. Despite the name, it's great for playing audio with a user interface. The only difference between our screen shots is that you had an AirPlay device present on the network at the time.
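For reference, a bare-bones setup might look something like this (a sketch only; MPMoviePlayerController has since been deprecated, and the file name and surrounding view controller are placeholders):

import MediaPlayer

// Sketch: MPMoviePlayerController showing its built-in transport UI
// while playing a local audio file. Intended to run inside a view
// controller, hence the bare `view` reference.
let url = Bundle.main.url(forResource: "track", withExtension: "mp3")!
let moviePlayer = MPMoviePlayerController(contentURL: url)
moviePlayer.view.frame = view.bounds
view.addSubview(moviePlayer.view)
moviePlayer.prepareToPlay()
moviePlayer.play()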

AVAudioRecorder and AirPlay Mirroring

When I have an AVAudioRecorder session active (i.e. when I'm recording audio), I can't activate AirPlay mirroring on the device. AirPlay mirroring just deactivates while the app is running and switches back on when the app exits. This post seems to suggest there is no way around this.
My thoughts are to try:
using a lower-level recording framework, or
outputting a separate window to an external display rather than mirroring (I've tried this; it doesn't work).
Is there another way around this, or do you know whether either of these methods is known to work?
Using an Audio Queue to record (as in Apple's SpeakHere sample code) rather than AVAudioRecorder works. It's a bit more work to implement, but recording continues whether AirPlay mirroring is on or off.
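For what it's worth, the skeleton of an Audio Queue input setup looks roughly like this (a sketch only; the callback here merely re-enqueues buffers, whereas a real recorder such as SpeakHere writes the captured data to a file):

import AudioToolbox

// Sketch: 16-bit mono PCM input via Audio Queue Services.
var format = AudioStreamBasicDescription(
    mSampleRate: 44100,
    mFormatID: kAudioFormatLinearPCM,
    mFormatFlags: kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked,
    mBytesPerPacket: 2,
    mFramesPerPacket: 1,
    mBytesPerFrame: 2,
    mChannelsPerFrame: 1,
    mBitsPerChannel: 16,
    mReserved: 0)

let callback: AudioQueueInputCallback = { _, queue, buffer, _, _, _ in
    // A real recorder would write buffer.pointee.mAudioData
    // (mAudioDataByteSize bytes) to disk here before re-enqueueing.
    AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
}

var queue: AudioQueueRef?
AudioQueueNewInput(&format, callback, nil, nil, nil, 0, &queue)

if let queue = queue {
    // Prime the queue with a few buffers, then start capturing.
    for _ in 0..<3 {
        var buffer: AudioQueueBufferRef?
        AudioQueueAllocateBuffer(queue, 4096, &buffer)
        if let buffer = buffer {
            AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
        }
    }
    AudioQueueStart(queue, nil)
}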

volume slider has no effect

I am creating a radio streaming app with play, pause, and a volume slider.
I have implemented the volume slider using MPVolumeView, but unfortunately it is not working.
Can anyone please let me know the correct code so that the volume slider will work in my app? I have used MPMoviePlayerController, AVPlayer, and AVAudioPlayer.
It sounds as if you missed an important part of Apple's documentation:
Working with Movies and iPod Music
I suspect that you are using AVAudioPlayer and MPMoviePlayerController together, and that you have set up some audio session attributes to get that working properly. When doing so, you may want to tell MPMoviePlayerController whether or not to use that session.
Using the Media Player Framework Exclusively
If your application is using a movie player only, or a music player only—and you are not playing your own sounds—then you should not configure an audio session.
If you are using a movie player exclusively, you must tell it to use its own audio session, as follows:
myMoviePlayer.useApplicationAudioSession = NO
If you are using a movie player and a music player, then you probably want to configure how the two interact; for this, you must configure an audio session, even though you are not playing application audio per se. Use the guidance in Table 6-1.
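One possible configuration for the situation described above is sketched here (player and view names are placeholders; useApplicationAudioSession and MPMoviePlayerController are long deprecated). Note also that MPVolumeView only renders its slider on a physical device, not in the Simulator:

import AVFoundation
import MediaPlayer

// The app plays its own audio as well, so it configures a shared session...
try? AVAudioSession.sharedInstance().setCategory(.playback, options: [.mixWithOthers])
try? AVAudioSession.sharedInstance().setActive(true)

// ...and tells the movie player whether to join that session or use its own
// (the property from the quoted guidance above).
moviePlayer.useApplicationAudioSession = false

// MPVolumeView adjusts the shared system output volume; test on a device,
// since the slider does not appear in the Simulator.
let volumeView = MPVolumeView(frame: CGRect(x: 20, y: 100, width: 280, height: 44))
view.addSubview(volumeView)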

Full-featured music player using AVPlayer

I'm writing a music player for iOS that needs to have all the features of the built-in Music app. My app needs to continue running in the background so I have to use the AVPlayer class.
Are there any open source implementations out there that I can use instead of writing the whole thing myself?
Just found this. It works great:
https://github.com/gangverk/GVMusicPlayerController
If you want to play tracks from your iTunes music library and don't want to use the MPMusicPlayerController class, your best bet is to use AVPlayer or AVQueuePlayer (a subclass of AVPlayer). You must establish the appropriate audio session and register to receive remote-control events for the app to continue playing music in the background.
There are downsides to this method: you won't be able to play DRM-protected tracks or audiobooks purchased from the iTunes Store, and there's no way to initiate an iTunes Match download with the AVPlayer class. Furthermore, you'll have a bit of work on your hands if you want to add gapless playback and equaliser settings (the closest you'll get to gapless playback is with the AVQueuePlayer subclass, though in theory you could overlap AVPlayers with an NSTimer).
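A rough sketch of that approach (assuming media-library access has already been granted; names here are illustrative):

import MediaPlayer
import AVFoundation

// Sketch: build an AVQueuePlayer queue from the user's music library.
// DRM-protected and cloud-only (iTunes Match) items have a nil assetURL
// and are simply skipped here.
let songs = MPMediaQuery.songs().items ?? []
let playerItems = songs.compactMap { song -> AVPlayerItem? in
    guard let url = song.assetURL else { return nil }
    return AVPlayerItem(url: url)
}
let queuePlayer = AVQueuePlayer(items: playerItems)
queuePlayer.play()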
You'll also need to set 'Required Background Modes' in your Info.plist to 'App plays audio'.
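The session and remote-control plumbing mentioned above might look roughly like this (a sketch using the MPRemoteCommandCenter API; queuePlayer is the hypothetical player from the previous snippet):

import AVFoundation
import MediaPlayer

// Sketch: background playback needs the audio background mode in
// Info.plist, a .playback audio session, and remote-command handlers
// so the lock screen / Control Center controls keep working.
func configureBackgroundPlayback() throws {
    try AVAudioSession.sharedInstance().setCategory(.playback)
    try AVAudioSession.sharedInstance().setActive(true)

    let commands = MPRemoteCommandCenter.shared()
    _ = commands.playCommand.addTarget { _ in
        queuePlayer.play()
        return .success
    }
    _ = commands.pauseCommand.addTarget { _ in
        queuePlayer.pause()
        return .success
    }
}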
As for the rest of your app, I suggest you read up on UITabBarControllers and UITableViewControllers along with MPMediaQuerys.
See this solution for the audio part.
