How to implement an audio player interface? - iOS

I'm using AVAudioPlayer to play MP3 files, but I need to implement a UI like the one shown below:
I think it may be an iOS system control I can use, but I can't find which control it is.

That is an MPMoviePlayerController. Compare, for example, this illustration from my book:
The same section of my book tells you how to make and work with one of these. Despite the name, it's great for playing audio with a user interface. The only difference between our screen shots is that you had an AirPlay device present on the network at the time.
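For reference, here's a minimal sketch of that approach (the resource name "song" and the view-controller context are placeholders; the book covers the details):

    // Play an MP3 with the stock transport UI (iOS 5/6-era MediaPlayer API).
    // Assumes this runs inside a view controller; "song.mp3" is a placeholder resource.
    #import <MediaPlayer/MediaPlayer.h>

    NSURL *url = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"mp3"];
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:url];
    player.controlStyle = MPMovieControlStyleEmbedded;  // the standard playback controls
    player.view.frame = self.view.bounds;               // size its view however you like
    [self.view addSubview:player.view];
    [player prepareToPlay];
    [player play];
    // Keep a strong reference to the player (e.g. in a property), or playback will stop.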

Related

Switch between audio tracks in MPMoviePlayerController

In the app I'm currently working on I have some videos with multiple audio tracks (in different languages), and I'd like to be able to switch between these tracks programmatically. There is a button that allows the user to switch between them manually (tapping it presents this screen: http://i.stack.imgur.com/mkUTZ.png), so I assumed this wouldn't be too hard. However, after two hours of research I haven't found anything useful. There's this StackOverflow question that suggests it's impossible to play video with multiple audio streams in the first place, but apparently that no longer holds true. There are also some examples with AVFoundation (I haven't really looked into them), but none with MPMoviePlayer or MPMoviePlayerController.
So I'd like to know if my goal is achievable with the stock MPMoviePlayerController, and if not, what the good alternatives to it are.
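For context, the AVFoundation route mentioned above would look roughly like this. This is a sketch only, assuming the file exposes its languages as an "audible" media selection group; movieURL is a placeholder:

    // Sketch: switching audio tracks with AVFoundation rather than MPMoviePlayerController.
    #import <AVFoundation/AVFoundation.h>

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];  // movieURL: placeholder
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

    AVMediaSelectionGroup *audioGroup =
        [asset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicAudible];
    for (AVMediaSelectionOption *option in audioGroup.options) {
        // Pick the track whose language matches, e.g. German.
        if ([[option.locale objectForKey:NSLocaleLanguageCode] isEqualToString:@"de"]) {
            [item selectMediaOption:option inMediaSelectionGroup:audioGroup];
        }
    }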

AVComposition breaks on AirPlay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't post comments, so I had to post this as an answer, although it might not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to do it myself by listening to the UIScreen connection notifications.
I have to say it all worked pretty well. I first check whether there is more than one screen, and if there is I simply move the AVPlayer to that screen, while displaying a simple message on the device's screen that the content is playing on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it's not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
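Roughly, that part looks like this (a trimmed sketch; self.player and self.externalWindow are assumed properties, and the "playing on ..." label is left out):

    // Sketch: when an external screen appears, hang a window off it and move the video layer there.
    - (void)screenDidConnect:(NSNotification *)note  // registered for UIScreenDidConnectNotification
    {
        if ([UIScreen screens].count < 2) return;
        UIScreen *external = [[UIScreen screens] objectAtIndex:1];

        self.externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
        self.externalWindow.screen = external;

        AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        layer.frame = self.externalWindow.bounds;
        [self.externalWindow.layer addSublayer:layer];
        self.externalWindow.hidden = NO;

        // On the device itself, show a "Playing on <AirPlay device>" message instead.
    }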
That was fine until I noticed that the composition plays really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering the resolution, lowering the renderScale and so on, but with no success.
One thing that bothers me even more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note it's still not rendered, so it must be using a composition to display it), then right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So, for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with issues.
In the app's Info.plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
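(In the raw plist those correspond, if I remember correctly, to the UIBackgroundModes key with an "audio" entry:)

    <key>UIBackgroundModes</key>
    <array>
        <string>audio</string>
    </array>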
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!

Use iPhone's multitasking music controls in an app

I've searched the Internet for a while trying to find an answer to this. If you open up the multitasking bar and swipe to the left, there are music controls that can be used by whatever app is playing music (e.g. Music, Pandora, etc.). I have not yet discovered a way to use these in my own music-playing application. Does anyone know how to do that?
You should take a look at this example among the Apple API doc samples. It's basically what you want: a small audio player that interfaces with the iTunes library, lets you pick songs from the library, and behaves much like the standard Music app, including the music-control callbacks.
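The part that hooks an app up to those controls is the remote-control event machinery; a minimal sketch in a view controller looks roughly like this (the player calls are placeholders):

    // Sketch: receive the multitasking-bar playback controls in a view controller.
    - (void)viewDidAppear:(BOOL)animated
    {
        [super viewDidAppear:animated];
        [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
        [self becomeFirstResponder];  // required so remote-control events reach this responder
    }

    - (BOOL)canBecomeFirstResponder
    {
        return YES;
    }

    - (void)remoteControlReceivedWithEvent:(UIEvent *)event
    {
        if (event.type != UIEventTypeRemoteControl) return;
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                // toggle your player here (placeholder)
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                // skip to the next song (placeholder)
                break;
            default:
                break;
        }
    }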

Set AirPlay manually?

I have an iPad app with a video playing in a view. I would like to play the video using AirPlay, but by pressing my own button.
I have set allowsAirPlay = YES and so forth; this works if I enable the full controls, but I want to show no controls and have my own button to play the video over AirPlay.
So far, I have found no information that would allow me to play a video on Apple TV without allowing the normal controls.
So, just a UIButton action to force AirPlay, or at least a way to get the available devices and set one manually. Anything that would allow me to do this.
MPVolumeView will only control audio; it won't control video. For that you'd need iOS 5's AVPlayer, or a movie controller.
An alternative for you might be to use AirplayKit, a third-party library.
https://github.com/rothacr/AirplayKit
To answer my own question: this is quite possible without a jailbreak.
Here is Apple's own page explaining it, so this will pass the review process:
Apple Developer Library document explaining how to do this
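For anyone else reading: one commonly used way to get a dedicated AirPlay button without the full transport controls (I'm assuming this matches what that document describes) is an MPVolumeView stripped down to its route button, roughly:

    // Sketch: an AirPlay route picker without the full playback controls.
    #import <MediaPlayer/MediaPlayer.h>

    MPVolumeView *routeView = [[MPVolumeView alloc] initWithFrame:CGRectMake(0, 0, 44, 44)];
    routeView.showsVolumeSlider = NO;  // hide the volume slider
    routeView.showsRouteButton = YES;  // keep only the route (AirPlay) button
    [self.view addSubview:routeView];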

Independent volume control of AVAudioPlayer and MPMusicPlayerController in an iOS app

Within my application, I am playing downloaded audio using an AVAudioPlayer, while simultaneously playing audio from the user's iPod music library with an MPMusicPlayerController.
I need to be able to adjust the volume of the AVAudioPlayer instance so that it's louder than the audio coming from the MPMusicPlayerController.
The problem is, when I adjust the volume property of the AVAudioPlayer, it also adjusts the volume of the MPMusicPlayerController.
Is there any solution which would allow me to independently control the volume of these two players?
If not, is there another technique I should use to do this? Any help is appreciated.
Take a look at the documentation for AVAudioSession. For example, the AVAudioSession Programming Guide says the following:
"Finally, you can enhance a category to automatically lower the volume of other audio when your audio is playing. This could be used, for example, in an exercise application. Say the user is exercising along to their iPod when your application wants to overlay a verbal message—for instance, “You’ve been rowing for 10 minutes.” To ensure that the message from your application is intelligible, apply the kAudioSessionProperty_OtherMixableAudioShouldDuck property to your audio session. When ducking takes place, all other audio on the device—apart from phone audio—lowers in volume."
I think it might solve your problem. The documentation on initializing an AVAudioSession and setting its categories and properties is pretty clear and easy to follow; you should have no trouble.
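For completeness, enabling that property through the C-level Audio Session API looks roughly like this (a sketch; the category choice and the mix-with-others override are assumptions about your setup, and newer SDKs expose the same idea as AVAudioSessionCategoryOptionDuckOthers):

    // Sketch: let the AVAudioPlayer's audio duck the iPod audio while it plays.
    #import <AudioToolbox/AudioToolbox.h>

    AudioSessionInitialize(NULL, NULL, NULL, NULL);

    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);

    UInt32 mix = 1;  // keep the iPod audio playing alongside ours
    AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers,
                            sizeof(mix), &mix);

    UInt32 duck = 1;  // lower other audio while our session is active
    AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck,
                            sizeof(duck), &duck);

    AudioSessionSetActive(true);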
