I have a streaming URL along the lines of "http://myserver.com/master.m3u8" (this is a dummy URL).
This URL plays fine in the Safari browser on an iPhone.
But when I play the same URL within the app using the following code, I'm running into a problem:
NSURL* theURL = [NSURL URLWithString:@"http://myserver.com/master.m3u8"];
MPMoviePlayerViewController* moviePlayerViewController = [[MPMoviePlayerViewController alloc] initWithContentURL:theURL];
moviePlayerViewController.moviePlayer.movieSourceType = MPMovieSourceTypeStreaming;
[self presentMoviePlayerViewControllerAnimated:moviePlayerViewController];
The problem when playing within the app is that, after some time, the screen turns black, but I can still hear the audio.
How can I debug where the issue is?
Can someone who has faced a similar issue please help?
If you create a standard m3u8 file, the lowest variant of the stream will be an audio-only version. So if the bandwidth is too low, the player may switch to that variant and play audio only.
I haven't found a way yet to do something meaningful in the app when this happens (for example, pause the video and wait until the bandwidth is sufficient to play the next higher variant, which has video again), but if you can tweak the m3u8 or the encoding process, you could simply remove the audio-only variant from your master m3u8 (see the sketch below). The player would then switch to the lowest video stream and pause if the bandwidth isn't enough to show it.
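As a rough illustration, here is a minimal master playlist; the variant playlist names and bandwidth values are made up. Deleting the audio-only pair of lines leaves only variants that carry video:
#EXTM3U
# audio-only fallback variant (the pair of lines to remove):
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
audio_only.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=240000,RESOLUTION=400x224
low_video.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=640000,RESOLUTION=640x360
mid_video.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=960x540
high_video.m3u8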
Please keep in mind that you have to tell the App Review team about this when submitting the app to the store. This is mentioned in this Technical Q&A from Apple: Resolving App Store Approval Issues for HTTP Live Streaming
Note: As the baseline 64 kbps maximum audio-only HTTP Live stream requirement is specifically for streaming over a cellular network, if your application is self-restricting to Wi-Fi only HTTP Live Streaming and you choose to not supply a baseline 64 kbps audio-only stream, you must provide this information to the App Review team. Developers can include this information in the Review Notes field for your application.
Related
I'm building an app that plays different streaming videos. The file that I'm playing in my AVPlayer object is an MP4 file.
Reading through the App Store Review Guidelines I just noticed that the rule 2.5.7 says:
Video streaming content over a cellular network longer than 10 minutes must use HTTP Live Streaming and include a baseline 192 kbps HTTP Live stream.
What does this mean exactly? Can I stream an MP4 video which is longer than 10 minutes?
If your MP4 video is less than 10 minutes long, then presumably you can just put it on a server somewhere and have the player download the file (progressive download); you don't need to use a streaming protocol like HLS. However, if your video is longer than 10 minutes, you must use HLS. This means segmenting your video into chunks and creating a playlist for them (a sketch of such a playlist is shown below). You can do this with Apple's streaming tools, such as mediafilesegmenter, or you can use ffmpeg to segment your videos.
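To give a rough idea of what the segmenter produces, here is a minimal media playlist referencing the generated TS chunks; the segment file names are made up:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST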
That guideline applies to cellular networks only, so it doesn't apply if the user is connected via Wi-Fi. Take a look at Apple's recommendations for encoding your video(s) for HLS.
It explicitly says you CAN play a video longer than 10 minutes. However, the guidelines say your app will be rejected if it doesn't meet the requirements stated.
Admittedly I've never attempted a 10-minute video playback, but the documentation seems to imply that your media must allow 192 kbps playback (presumably for cellular data plans) and must also conform to the HTTP Live Streaming protocol.
Here is the technical documentation Apple provides on HTTP Live Streaming:
https://developer.apple.com/library/ios/technotes/tn2224/_index.html
Best of luck! Please let me know if I can help with anything more specific :)
My app needs to play some music files, such as .mp3. I would like to use MPMoviePlayerController because it has all the UI implemented for me, i.e. I do not want to bother implementing a progress slider and things like that.
I tested it playing a .mp3 file and it worked fine, but I don't know whether it is acceptable to use it this way, because its name says "movie player" and it seems to be intended for playing movies. Would Apple reject this? Thank you.
For playing audio from a file or memory, AVAudioPlayer is your best option, but unfortunately it doesn't support playing from a network stream, while MPMoviePlayerController does.
From the documentation:
An instance of the AVAudioPlayer class, called an audio player, provides playback of audio data from a file or memory. Apple recommends that you use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.
As for Apple's review, I don't think your application will be rejected for using the Media Player framework to play an audio file. In fact, here they explicitly say that you can do just that:
Choose the right technology for your needs:
To play the audio items in a user's iPod library, or to play local or streamed movies, use the Media Player framework. Classes in this framework automatically support sending audio and video to AirPlay devices such as Apple TV.
Not sure about performance and memory issues though!
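For what it's worth, here is a minimal sketch of playing a remote mp3 through MPMoviePlayerViewController; the URL is just a placeholder:
#import <MediaPlayer/MediaPlayer.h>

// Somewhere inside a view controller:
NSURL *audioURL = [NSURL URLWithString:@"http://example.com/track.mp3"]; // placeholder URL
MPMoviePlayerViewController *playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:audioURL];
// Streaming source type, so the player doesn't expect a local file
playerVC.moviePlayer.movieSourceType = MPMovieSourceTypeStreaming;
[self presentMoviePlayerViewControllerAnimated:playerVC];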
Best of luck.
I'm using AVPlayer to play YouTube videos; for each YouTube video ID I retrieve a couple of stream URLs in different qualities.
I want to play a particular stream quality according to the network state. For example, if the user is on 3G I want to play the lowest-quality URL, but if the user moves to Wi-Fi I want to seamlessly switch to the better-quality stream.
This is nothing new; YouTube does it in its own app, and many others do too.
So I wonder what the best way is to do this kind of switching with AVPlayer. I want the switch to be as unnoticeable as possible, without pausing playback or buffering.
Any advice?
I'm not sure whether this kind of functionality is supported on the YouTube servers or whether I need to do it on the client side.
You should have a look at the Apple documentation on HTTP live streaming.
The only way to achieve the type of switching you want, and the one covered in the documentation, is to use m3u8 index files and TS files containing the video data.
You connect to the index file and store its contents, which will be multiple URLs along with bandwidth requirements (see the examples here). Then use the Reachability class to check the network status and connect to the appropriate stream. Start the Reachability notifier and react to its notifications by changing the stream you're connected to. This will cause the TS files belonging to that stream to be downloaded and buffered for playback, achieving the type of switching you want.
As I said, the drawback is the requirement to use TS files. This would mean your video files would have to be downloaded from YouTube, prepared using the Apple-provided mediafilesegmenter command-line tool and then hosted on a web server! Not ideal at all, but as far as I'm aware this is the only way to do it.
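A rough sketch of the Reachability part, assuming Apple's Reachability sample class (Reachability.h/.m) has been added to the project and that self.reachability is a property you declared for it:
- (void)startWatchingNetwork {
    // Get notified whenever the network status changes
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(reachabilityChanged:)
                                                 name:kReachabilityChangedNotification
                                               object:nil];
    self.reachability = [Reachability reachabilityForInternetConnection];
    [self.reachability startNotifier];
}

- (void)reachabilityChanged:(NSNotification *)note {
    NetworkStatus status = [self.reachability currentReachabilityStatus];
    if (status == ReachableViaWiFi) {
        // connect to the high-bandwidth stream
    } else if (status == ReachableViaWWAN) {
        // connect to the low-bandwidth stream
    }
}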
Check out the AVPlayer replaceCurrentItemWithPlayerItem: method. If I were you, I would use Reachability to observe the user's network status. When the network reachability degrades, you can do something like this:
// Build a new item for the lower-quality URL
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:urlOfLowerQuality];
// Move the new item to the position the current item has reached
[item seekToTime:player.currentTime];
// Swap items; playback continues from roughly the same position
[player replaceCurrentItemWithPlayerItem:item];
Use a reachability manager to check whether the current connection is Wi-Fi or 3G. Switch the URL according to the connection type. When switching URLs, take the current playback time of the video and set it as the seek time on the new item.
I'm working with Apple's HTTP Live Streaming protocol, which, when submitting apps to the App Store, requires that there is an audio-only stream as part of the multiplex. As a result, the first segment (10 seconds) is always audio only, and the audio-only placeholder image is shown instead of the beginning of the video stream, regardless of the amount of bandwidth that's actually available.
I know I can show a static image instead, but I'm wondering if there's a way to delay the stream starting until it's determined whether there's enough bandwidth to go straight to the video stream.
The order of your bitrates in the manifest file is key, as the device tries to play the bitrate variants in the order they are listed. We recommend listing the bitrates from highest to lowest, to avoid starting the video off with the worst bitrate and then switching only once iOS has detected sufficient bandwidth.
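As a rough illustration (the variant names and bandwidth values are made up), a master playlist ordered this way puts the audio-only entry last, so the player does not start playback with it:
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1280000,RESOLUTION=960x540
high.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=640000,RESOLUTION=640x360
mid.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=240000,RESOLUTION=400x224
low.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
audio_only.m3u8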
In my app I use MPMoviePlayerController to play an mp3 file from a web server. This plays while downloading the whole file, which is fine over Wi-Fi. But now I want it to work over 3G (and get it into the App Store). How do I get it to buffer just the next 10 seconds or so (as per Apple's rules)? I'm digging through the documentation on AVPlayer, HTTP Live Streaming, etc., but I'm still confused about the best way to do this. With so many podcast apps out there, I'm surprised there aren't more tutorials/libraries about it.
Thanks for your time.
I investigated this as well, and I was not able to find a way to limit the look-ahead buffer using MPMoviePlayerController. I believe you would have to load chunks at the network layer and feed them in at the AVFoundation layer, but I have not attempted this myself.
That said, I can confirm that you can get an app approved that plays mp3 files using MPMoviePlayerController over both WiFi and 3G connections. In my app I added a setting so the user can decide whether to enable mp3 downloads over 3G or not, although I don't know if that was needed to get approved. I provided it so users didn't inadvertently incur bandwidth costs.