MPMoviePlayer playing audio but not video when seeking into a file - iOS

I'm trying to seek into a video file at a certain point. Let's say the video is 5 minutes long and I'm jumping in at 110 seconds.
When I play from the beginning, everything plays through fine, however, when I try to seek into the file, I can hear the audio but I can't see the video. I first thought this was maybe an issue with the order I'm loading the subviews but I can still see (and use) the controls for the player. Sliding back to 0:00 starts the video.
The following is code from my video class. The initIntoView method accepts a UIView and then returns an amended copy which then gets written to the main view. Sorry in advance for the messy code. I'm still quite new to Objective-C.
Init the Video view
- (WWFVideo *)initIntoView:(UIView *)view withContent:(NSDictionary *)contentDict {
    self = [super init];
    viewRef = view;
    contentData = contentDict;
    NSURL *videoUrl = [[NSURL alloc] initWithString:[contentDict objectForKey:@"cnloc"]]; // Returns an HTTP link to my video file (MP4, H.264, AAC audio)
    videoController = [[MPMoviePlayerController alloc] init];
    videoController.movieSourceType = MPMovieSourceTypeFile;
    [videoController setContentURL:videoUrl];
    videoController.view.frame = viewRef.bounds;
    [videoController.view setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
    [viewRef addSubview:videoController.view];
    return self;
}
Start playing the video
- (void)play:(int)offset { // Offset is "110"
    [videoController setInitialPlaybackTime:offset];
    [videoController play];
}
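One variation on this, for completeness, is to defer the seek until the player reports a playable load state instead of seeking up front. A rough sketch (assuming the same videoController ivar; seekOffset is a hypothetical ivar holding the target time):
- (void)play:(int)offset {
    seekOffset = offset; // seekOffset: hypothetical ivar holding the 110-second target
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(loadStateDidChange:)
                                                 name:MPMoviePlayerLoadStateDidChangeNotification
                                               object:videoController];
    [videoController play];
}

- (void)loadStateDidChange:(NSNotification *)notification {
    // Seek only once the player reports it can keep up with playback.
    if (videoController.loadState & MPMovieLoadStatePlaythroughOK) {
        videoController.currentPlaybackTime = seekOffset;
    }
}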
I've tried adding the videoController to viewRef both before and after the video starts playing but it has the same outcome.
I've also tried using an MPMoviePlayerViewController, to no avail.
Another thing I tried was changing the streaming type to MPMovieSourceTypeStreaming but it seemed to have no effect.
If I've missed any more vital code, just ask and I'll see what I can do.
Edit:
Xcode 4.6.3
iOS 6
Testing on an iPad 2
Edit #2:
Works perfectly on the simulator, just not on the device.

After trying to piece together a sample app to upload here, I found that the w3 version of Big Buck Bunny worked fine. This indicates it was an encoding problem and not an Objective-C issue.
I've re-encoded the same file I was trying to play before, now using the baseline profile, with the following command:
ffmpeg -i {filename} -acodec aac -ac 2 -strict experimental -ab 160k -s {ssize} -vcodec libx264 -preset slow -profile:v baseline -level 30 -maxrate 10000000 -bufsize 10000000 -b 1200k -f mp4 -threads 0 {filename}.ipad.mp4
I found this command through this Stack Overflow post.
Primarily for low-cost applications that require additional data loss robustness, this profile is used in some videoconferencing and mobile applications. This profile includes all features that are supported in the Constrained Baseline Profile, plus three additional features that can be used for loss robustness (or for other purposes such as low-delay multi-point video stream compositing). The importance of this profile has faded somewhat since the definition of the Constrained Baseline Profile in 2009. All Constrained Baseline Profile bitstreams are also considered to be Baseline Profile bitstreams, as these two profiles share the same profile identifier code value.
-From Wikipedia
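If you want to verify which profile a re-encoded file actually ended up with, ffprobe (which ships with ffmpeg) can report it; something like:
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,level -of default=noprint_wrappers=1 {filename}.ipad.mp4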
I realise this may not help anyone here looking for Objective-C help but if it saves just one person the 5 hours I spent today trying to get this working, this will be worth it.

Related

Read HLS Playlist information to dynamically change the preferredBitRate of an Item

I'm working on a video app. We are changing from regular mp4 files to HLS; one of the many reasons we have to make the change is that we have much more control over the bandwidth usage of videos (we load lots of other stuff in our player, so we need to optimize the experience the best way we can).
So, in iOS 10 AVFoundation introduced the ability to control the bandwidth using:
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:self.urlAsset];
playerItem.preferredForwardBufferDuration = 30.0;
playerItem.preferredPeakBitRate = 200000.0; // Remember this line
There's also a configuration introduced in iOS 11 to set the maximum resolution of the item with preferredMaximumResolution, so we're using it, but we still need a solution for iOS 10 devices.
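For reference, a minimal sketch of the iOS 11 path (the 854x480 size is just an example cap of roughly 480p):
if (@available(iOS 11.0, *)) {
    // Cap the resolution of the variants the player is allowed to select.
    playerItem.preferredMaximumResolution = CGSizeMake(854.0, 480.0);
}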
Well, now we have control over preferredPeakBitRate, which is nice, but we have a problem: not all the HLS sources are generated by us. So, let's say we want to set a maximum resolution of 480p when you're not connected to a Wi-Fi network; today I have no way to achieve that, because I won't always know how much bandwidth the 480p source of the selected HLS playlist needs.
One thing I was thinking about is to read the information inside the m3u8 file, to at least know which quality sources my player can show and how much bandwidth each one needs.
One way to do this would be to download the m3u8 playlist as plain text, use a regex to read the file, and process this data. I'm trying to avoid that; I think this should be far less difficult.
I cannot read this information from the tracks, because a) I can't find the information there, and b) the tracks are replaced dynamically when changing the quality: one track for every quality level.
So, I don't know how I can get this information. I've searched Google and Stack Overflow and can't find it. Can anyone help me?
Here's an example of what I want to do. I have this example playlist:
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=314000,RESOLUTION=228x128,CODECS="mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_192k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=478000,RESOLUTION=400x224,CODECS="avc1.42001e,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_400k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=691000,RESOLUTION=480x270,CODECS="avc1.42001e,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_600k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1120000,RESOLUTION=640x360,CODECS="avc1.4d001f,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_1000k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1661000,RESOLUTION=960x540,CODECS="avc1.4d001f,mp4a.40.2"
test-hls-1-16a709300abeb08713a5cada91ab864e_hls_duplex_1500k.m3u8
And I just want to have that information available in an array inside my code, something like this:
NSArray<ZZMetadata *> *metadataArray = self.urlAsset.bandwidthMetadata;
NSLog(@"Metadata info: %@", metadataArray);
And print something like this:
<__NSArrayM 0x123456789> (
    <ZZMetadata 0x234567890> {
        trackId: 1
        neededBandwidth: 314000
        resolution: 228x128
        codecs: ...
        ...
    }
    <ZZMetadata 0x345678901> {
        trackId: 2
        neededBandwidth: 478000
        resolution: 400x224
    }
    ...
)
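For comparison, here is a minimal sketch of the plain-text route I'd rather avoid (assuming masterPlaylistURL points at the playlist above; it only pulls BANDWIDTH and RESOLUTION out of each #EXT-X-STREAM-INF line):
// Hypothetical sketch: fetch the master playlist and scan each
// #EXT-X-STREAM-INF line for its BANDWIDTH and RESOLUTION attributes.
NSString *playlist = [NSString stringWithContentsOfURL:masterPlaylistURL
                                              encoding:NSUTF8StringEncoding
                                                 error:NULL];
NSRegularExpression *regex =
    [NSRegularExpression regularExpressionWithPattern:@"BANDWIDTH=(\\d+)|RESOLUTION=(\\d+x\\d+)"
                                              options:0
                                                error:NULL];
NSMutableArray *variants = [NSMutableArray array];
for (NSString *line in [playlist componentsSeparatedByString:@"\n"]) {
    if (![line hasPrefix:@"#EXT-X-STREAM-INF:"]) continue;
    NSMutableDictionary *info = [NSMutableDictionary dictionary];
    [regex enumerateMatchesInString:line
                            options:0
                              range:NSMakeRange(0, line.length)
                         usingBlock:^(NSTextCheckingResult *match, NSMatchingFlags flags, BOOL *stop) {
        if ([match rangeAtIndex:1].location != NSNotFound)
            info[@"neededBandwidth"] = [line substringWithRange:[match rangeAtIndex:1]];
        if ([match rangeAtIndex:2].location != NSNotFound)
            info[@"resolution"] = [line substringWithRange:[match rangeAtIndex:2]];
    }];
    [variants addObject:info];
}
NSLog(@"Variant info: %@", variants);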

Stream video while downloading iOS

I am using iOS 7 and I have a .mp4 video that I need to download in my app. The video is large (~1 GB), which is why it is not included as part of the app. I want the user to be able to start watching the video as soon as it starts downloading. I also want the video to be cached on the iOS device so the user doesn't need to download it again later. Neither of the normal methods of playing videos (progressive download and live streaming) seems to let you cache the video, so I have made my own web service that chunks up my video file and streams the bytes down to the client. I start the streaming HTTP call using NSURLConnection:
self.request = [[NSMutableURLRequest alloc] initWithURL:self.url];
[self.request setTimeoutInterval:10]; // Expect data at least every 10 seconds
[self.request setHTTPMethod:@"GET"];
self.connection = [[NSURLConnection alloc] initWithRequest:self.request delegate:self startImmediately:YES];
When I receive a data chunk, I append it to the end of the local copy of the file:
- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data
{
    NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:[self videoFilePath]];
    [handle truncateFileAtOffset:[handle seekToEndOfFile]];
    [handle writeData:data];
}
If I let the device run, the file is downloaded successfully and I can play it using MPMoviePlayerViewController:
NSURL *url=[NSURL fileURLWithPath:self.videoFilePath];
MPMoviePlayerViewController *controller = [[MPMoviePlayerViewController alloc] initWithContentURL:url];
controller.moviePlayer.scalingMode = MPMovieScalingModeAspectFit;
[self presentMoviePlayerViewControllerAnimated:controller];
However, if I start the player before the file is completely downloaded, the video starts playing just fine. It even displays the correct video length in the top scrubber bar. But when the user gets to the position in the video that the download had reached when playback started, the video just hangs. If I close and reopen the MPMoviePlayerViewController, the video plays until it gets to whatever position the download had reached when I launched the MPMoviePlayerViewController again. If I wait until the entire video is downloaded, the video plays without a problem.
I am not getting any events fired, or error messages printed to the console when this happens (MPMoviePlayerPlaybackStateDidChangeNotification and MPMoviePlayerPlaybackDidFinishNotification are never sent after the video starts). It seems like there is something else that is telling the controller what the length of the video is other than what the scrubber is using...
Does anyone know what could be causing this issue? I am not bound to using MPMoviePlayerViewController, so if a different video playback method would work in this situation I am all for it.
Related Unresolved Questions:
AVPlayer and Progressive Video Downloads with AVURLAssets
Progressive Video Download on iOS
How to play an in downloading progress video file in IOS
UPDATE 1
I have found that the video stall is indeed because of the file size when the video starts playing. I can get around this issue by creating a zeroed-out file before I start the download and overwriting it as I go. Since I have control over the video streaming server, I added a custom header so I know the size of the file being streamed (the default file size header for a streaming file is -1). I am creating the file in my didReceiveResponse method as follows:
- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response
{
    // Retrieve the size of the file being streamed.
    NSHTTPURLResponse *httpResponse = (NSHTTPURLResponse *)response;
    NSDictionary *headers = httpResponse.allHeaderFields;
    NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
    [formatter setNumberStyle:NSNumberFormatterDecimalStyle];
    self.streamingFileSize = [formatter numberFromString:[headers objectForKey:@"StreamingFileSize"]];

    // Check if we need to initialize the download file
    if (![[NSFileManager defaultManager] fileExistsAtPath:self.path])
    {
        // Create the file being downloaded
        [[NSData data] writeToFile:self.path atomically:YES];

        // Allocate the size of the file we are going to download.
        const char *cString = [self.path cStringUsingEncoding:NSASCIIStringEncoding];
        int success = truncate(cString, self.streamingFileSize.longLongValue);
        if (success != 0)
        {
            /* TODO: handle errors here. Probably not enough space... See 'man truncate' */
        }
    }
}
This works great, except that truncate causes the app to hang for about 10 seconds while it creates the ~1GB file on disk (on the simulator it is instant, only a real device has this problem). This is where I am stuck now - does anyone know of a way to allocate a file more efficiently, or a different way to get the video player to recognize the size of the file without needing to actually allocate it? I know some filesystems support "file size" and "size on disk" as two different properties... not sure if iOS has something like that?
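One possibility, for anyone hitting the same wall: the Darwin-specific F_PREALLOCATE fcntl reserves the blocks without zero-filling them, so it should return quickly. A sketch, where fd and fileSize are placeholders:
// Sketch: reserve the space without writing zeros, then extend the
// logical file size. fd is an open descriptor, fileSize the target length.
fstore_t store = {F_ALLOCATECONTIG, F_PEOFPOSMODE, 0, fileSize, 0};
if (fcntl(fd, F_PREALLOCATE, &store) == -1) {
    // Contiguous allocation failed; retry allowing fragmented blocks.
    store.fst_flags = F_ALLOCATEALL;
    fcntl(fd, F_PREALLOCATE, &store);
}
ftruncate(fd, fileSize);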
I figured out how to do this, and it is much simpler than my original idea.
First, since my video is in .mp4, the MPMoviePlayerViewController or AVPlayer class can play it directly from a web server - I don't need to implement anything special and they can still seek to any point in the video. This must be part of how the .mp4 encoding works with the movie players. So, I just have the raw file available on the server - no special headers required.
Next, when the user decides to play the video I immediately start playing the video from the server URL:
NSURL *url = [NSURL URLWithString:serverVideoFileURLString];
controller = [[MPMoviePlayerViewController alloc] initWithContentURL:url];
controller.moviePlayer.scalingMode = MPMovieScalingModeAspectFit;
[self presentMoviePlayerViewControllerAnimated:controller];
This makes it so the user can watch the video and seek to any location they want. Then, I start downloading the file manually using NSURLConnection like I had been doing above, except now I am not streaming the file, I just download it directly. This way I don't need the custom header since the file size is included in the HTTP response.
When my background download completes, I switch the playing item from the server URL to the local file. This is important for network performance because the movie players only download a few seconds ahead of what the user is watching. Being able to switch to the local file as soon as possible is key to avoid downloading too much duplicate data:
NSTimeInterval currentPlaybackTime = controller.moviePlayer.currentPlaybackTime;
[controller.moviePlayer setContentURL:url];
[controller.moviePlayer setCurrentPlaybackTime:currentPlaybackTime];
[controller.moviePlayer play];
This method does have the user downloading two video files at the same time initially, but initial testing on the network speeds my users will be using shows it only increases the download time by a few seconds. Works for me!
You gotta create an internal web server that acts like a proxy! Then set your player to play the movie from localhost.
When using the HTTP protocol to play a video with MPMoviePlayerViewController, the first thing the player does is ask for byte range 0-1 (the first 2 bytes) just to obtain the file length. Then the player asks for "chunks" of the video using the Range HTTP header (the purpose is to save some battery).
What you have to do is implement this internal server so it delivers the video to the player, but your "proxy" must report the full length of the file as the video's length, even if the actual file hasn't been completely downloaded from the internet.
Then you set your player to play a movie from "http://localhost:someport".
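To make that concrete, the response the proxy has to fake for the player's initial probe looks roughly like this (the 1 GB total length is an example; note that the full file length goes into Content-Range even though only part of the file is on disk):
HTTP/1.1 206 Partial Content
Content-Range: bytes 0-1/1073741824
Content-Length: 2
Content-Type: video/mp4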
I've done this before... it works perfectly!
Good luck!
I can only assume that the MPMoviePlayerViewController caches the file length when you start it.
The way to fix (just) this issue is to first determine how large the file is. Then create a file of that length. Keeping an offset pointer, as the file downloads, you can overwrite the "null" values in the file with the real data.
So you get to a specific point in the download, start the MPMoviePlayerViewController, and let it run. I'd also suggest you use the "F_NOCACHE" flag (with fcntl()) so you bypass the file block cache (which means you will lower your memory footprint).
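The F_NOCACHE call itself is tiny; a sketch (fd is whatever descriptor you are writing the download to, and <fcntl.h> provides the constants):
// Sketch: bypass the file block cache for this descriptor (Darwin-specific).
// "path" is the file you are downloading into.
int fd = open(path, O_WRONLY);
fcntl(fd, F_NOCACHE, 1);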
The downside to this architecture is that if you get stalled, and the movie player gets ahead of you, well, the user is going to have a pretty bad experience. Not sure if there is any way for you to monitor and take preemptive action.
EDIT: it's quite possible that the video is not read sequentially, but that certain information requires the player to essentially look ahead for something. If so, then this is doomed to fail. The only other possible solution is to use some software tool to sequentially order the file (I'm no video expert so cannot comment from experience on any of the above).
To test this out, you can construct a "damaged" video of varying lengths, and test that to see what works and what does not. For instance, suppose you have a 100 MB file. Write a little utility program and overwrite the last 50 MB of data with zeros. Now play this video. It should fail halfway through. If it fails right away, well, you now know that it's seeking in the file.
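A quick way to build that damaged file without writing a utility program (a sketch; adjust the sizes to your own file):
# Zero out the back half of a 100 MB copy in place, keeping the size intact.
cp original.mp4 damaged.mp4
dd if=/dev/zero of=damaged.mp4 bs=1m seek=50 count=50 conv=notrunc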
If it's non-sequential, it's possible that it's looking at the last 1000 bytes or so, in which case, if you don't overwrite those, things work as you want. If you get lucky and this is the case, you would eventually download the last 1000 bytes first, then start from the front of the file.
It really comes down to finding some way to play a partial file before introducing real networking into the picture. You will surely find it easier to introduce the networking conditions artificially without really doing it in real time.

HTTP Live Streaming

OK, I have been trying to wrap my head around HTTP Live Streaming. I just do not understand it, and yes, I have read all the Apple docs and watched the WWDC videos, but I'm still super confused, so please help a wannabe programmer out!!!
Does the code you write go on the server? Not in Xcode?
If I am right, how do I set this up?
Do I need to set up something special on my server? Like PHP or something?
How do I use the tools that are supplied by Apple... segmenter and such?
Please help me,
Thanks
HTTP Live Streaming
HTTP Live Streaming is a streaming standard proposed by Apple. See the latest draft standard.
Files involved are
.m4a for audio (if you want a stream of audio only).
.ts for video. This is an MPEG-2 transport stream, usually with an H.264/AAC payload. It contains 10 seconds of video and it is created by splitting your original video file, or by converting live video.
.m3u8 for the playlist. This is a UTF-8 version of the WinAmp format.
Even though it's called live streaming, there is usually a delay of a minute or so during which the video is converted, the .ts and .m3u8 files are written, and your client refreshes the .m3u8 file.
All these files are static files on your server. But in live events, more .ts files are added, and the .m3u8 file is updated.
Since you tagged this question iOS it is relevant to mention related App Store rules:
You can only use progressive download for videos smaller than 10 minutes or 5 MB every 5 minutes. Otherwise you must use HTTP Live Streaming.
If you use HTTP Live Streaming you must provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).
Example
Get the streaming tools
To download the HTTP Live Streaming Tools do this:
Get a Mac or iPhone developer account.
Go to https://developer.apple.com and search for "HTTP Live Streaming Tools", or look around at https://developer.apple.com/streaming/.
Command line tools installed:
/usr/bin/mediastreamsegmenter
/usr/bin/mediafilesegmenter
/usr/bin/variantplaylistcreator
/usr/bin/mediastreamvalidator
/usr/bin/id3taggenerator
Descriptions from the man page:
Media Stream Segmenter: Create segments from MPEG-2 Transport streams for HTTP Live Streaming.
Media File Segmenter: Create segments for HTTP Live Streaming from media files.
Variant Playlist Creator: Create playlist for stream switching from HTTP Live streaming segments created by mediafilesegmenter.
Media Stream Validator: Validates HTTP Live Streaming streams and servers.
ID3 Tag Generator: Create ID3 tags.
Create the video
Install MacPorts, go to the terminal, and sudo port install ffmpeg. Then convert the video to a transport stream (.ts) using this FFmpeg script:
# bitrate, width, and height, you may want to change this
BR=512k
WIDTH=432
HEIGHT=240
input=${1}
# strip off the file extension
output=$(echo ${input} | sed 's/\..*//' )
# works for most videos
ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} -g 30 -async 2 ${output}-iphone.ts
This will generate one .ts file. Now we need to split the file into segments and create a playlist containing all those files. We can use Apple's mediafilesegmenter for this:
mediafilesegmenter -t 10 myvideo-iphone.ts
This will generate one .ts file for every 10 seconds of the video, plus a .m3u8 file pointing to all of them.
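The generated playlist (named prog_index.m3u8 by default, if I remember right) looks something like this:
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXT-X-ENDLIST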
Set up a web server
To play an .m3u8 on iOS we point to the file with Mobile Safari.
Of course, first we need to put them on a web server. For Safari (or other player) to recognize the ts files, we need to add its MIME types. In Apache:
AddType application/x-mpegURL m3u8
AddType video/MP2T ts
In lighttpd:
mimetype.assign = ( ".m3u8" => "application/x-mpegURL", ".ts" => "video/MP2T" )
To link this from a web page:
<html><head>
<meta name="viewport" content="width=320; initial-scale=1.0; maximum-scale=1.0; user-scalable=0;"/>
</head><body>
<video width="320" height="240" src="stream.m3u8"></video>
</body></html>
To detect the device orientation see Detect and Set the iPhone & iPad's Viewport Orientation Using JavaScript, CSS and Meta Tags.
More stuff you can do is create different bitrate versions of the video, embed metadata to read it while playing as notifications, and of course have fun programming with the MoviePlayerController and AVPlayer.
This might help, in Swift:
import UIKit
import MediaPlayer
class ViewController: UIViewController {

    var streamPlayer: MPMoviePlayerController = MPMoviePlayerController(contentURL: NSURL(string: "http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8")!)

    override func viewDidLoad() {
        super.viewDidLoad()
        streamPlayer.view.frame = self.view.bounds
        self.view.addSubview(streamPlayer.view)
        streamPlayer.fullscreen = true
        // Play the movie!
        streamPlayer.play()
    }
}
MPMoviePlayerController is deprecated from iOS 9 onwards. We can use AVPlayerViewController() or AVPlayer for the purpose. Have a look:
import AVKit
import AVFoundation
import UIKit
AVPlayerViewController :
override func viewDidAppear(animated: Bool) {
    let videoURL = NSURL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
    let player = AVPlayer(URL: videoURL!)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    self.presentViewController(playerViewController, animated: true) {
        playerViewController.player!.play()
    }
}
AVPlayer :
override func viewDidAppear(animated: Bool) {
    let videoURL = NSURL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
    let player = AVPlayer(URL: videoURL!)
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.bounds
    self.view.layer.addSublayer(playerLayer)
    player.play()
}
Another explanation, from Cloudinary: http://cloudinary.com/documentation/video_manipulation_and_delivery#http_live_streaming_hls
HTTP Live Streaming (also known as HLS) is an HTTP-based media streaming communications protocol that provides mechanisms that are scalable and adaptable to different networks. HLS works by breaking down a video file into a sequence of small HTTP-based file downloads, with each download loading one short chunk of a video file.
As the video stream is played, the client player can select from a number of different alternate video streams containing the same material encoded at a variety of data rates, allowing the streaming session to adapt to the available data rate with high quality playback on networks with high bandwidth and low quality playback on networks where the bandwidth is reduced.
At the start of the streaming session, the client software downloads a master M3U8 playlist file containing the metadata for the various sub-streams which are available. The client software then decides what to download from the media files available, based on predefined factors such as device type, resolution, data rate, size, etc.

Encoding SWF to video with Melt

I'm doing a project which requires converting SWF movies to H.264 video on the server side, to be able to play them both in the Flash player and on iPhone/iPad. And I really got stuck.
I'm using Melt from http://www.mltframework.org/ and this is my command-line:
melt movie.swf -consumer avformat:video.mp4 r=30 s=640x360 f=mp4 acodec=aac ab=128k ar=48000 vcodec=libx264 b=1000k an=1
It does play in the Flash player, but fails to play on iDevices. I googled for iPhone video requirements and it seems my video files do satisfy them (frame size, framerate, and bitrate). What settings should I change to make it play?
I've spent a lot of time in Google, but I managed to gather all the pieces, so these are the parameters that work for the iPhone:
r=30 s=640x360 f=mp4 acodec=aac ab=128k ar=48000 vcodec=libx264 level=30 b=1024k flags=+loop+mv4 cmp=256 partitions=+parti4x4+parti8x8+partp4x4+partp8x8+partb8x8 me_method=hex subq=7 trellis=1 refs=1 bf=0 flags2=+mixed_refs-wpred-dct8x8 coder=0 wpredp=0 me_range=16 g=250 keyint_min=25 sc_threshold=40 i_qfactor=0.71 qmin=10 qmax=51 qdiff=4 maxrate=10M bufsize=10M an=1 threads=0
Also, I use faac -w to convert the audio to the appropriate format and MP4Box to join the video and sound.
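For reference, that audio/mux step looks something like this (file names are placeholders):
# Wrap the AAC audio in an MP4 container, then mux the audio and video tracks.
faac -w -o audio.m4a audio.wav
MP4Box -add video.mp4#video -add audio.m4a -new final.mp4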

NetStream.Play.Failed with VideoTexture on iOS in Adobe Air App

I'm trying to get a video to play in an Away3d texture on iOS. It's fine on Android and Windows. The video will play in Starling on iOS so I know it's not the video.
Here is how I add the video:
sphereGeometry = new SphereGeometry(5000, 64, 48);
panoTextureMaterial = new TextureMaterial(panoTexture2DBase, false, false, false);
panoVideoMesh = new Mesh(sphereGeometry, panoTextureMaterial);
panoVideoMesh.scaleX *= -1;
panoVideoMesh.rotate(Vector3D.Y_AXIS,-90);
scene.addChild(panoVideoMesh);
panoTexture2DBase.player.play();
view.render();
On iOS I get this from the NetStream status events when I try to load it as a video texture:
NetStream.Play.Start
NetStream.Play.Failed
NetStream.Play.Stop
I'm using the Away3d NativeVideoTexture class
texture = context.createVideoTexture();
texture.attachNetStream(_player.ns);
I think it might be to do with the MP4 encoding. I've had a good look around and can't find anything that works; currently I'm trying this in FFmpeg:
-vcodec libx264 -profile:v main -level 3.1 -crf 23 -s 1024:768 -movflags +faststart
But what I set doesn't seem to make a lot of difference.
Any idea why my video is failing to load as a VideoTexture on iOS?
