I use AVPlayer to play .MOV videos stored on my dedicated server.
When a user wants to play a video, I load it in AVPlayer using a direct link like this: "wwww.myserver.com/videos/video.mov".
(I don't serve it through a PHP file; maybe I should.)
Generally videos take 1-2 seconds to start playing, but sometimes the start is very slow (up to a minute).
Once playback has started, loading is very quick, but the start can be long, very long (even on a fast connection).
The videos are small (6 MB maximum); I compressed them using SDAVAssetExportSession.
Also I've disabled App Transport Security.
Could the issue be server-side? I really don't know how to solve this problem.
Any help is appreciated
Edit 1: link to a video on my server
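For context, the kind of loading I mean looks roughly like this (a Swift sketch using the example URL from above; loading the "playable" key asynchronously is just one way to measure where the startup delay goes):

```swift
import AVFoundation

// Placeholder URL standing in for the direct link to the .mov file.
let url = URL(string: "http://wwww.myserver.com/videos/video.mov")!
let asset = AVURLAsset(url: url)

// Load the "playable" key asynchronously before building the player item,
// so the initial header fetch doesn't block and can be timed.
let start = Date()
asset.loadValuesAsynchronously(forKeys: ["playable"]) {
    var error: NSError?
    let status = asset.statusOfValue(forKey: "playable", error: &error)
    DispatchQueue.main.async {
        guard status == .loaded else {
            print("Asset not playable: \(String(describing: error))")
            return
        }
        print("Asset became playable after \(Date().timeIntervalSince(start))s")
        let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
        player.play()  // in real code, keep a strong reference to `player`
    }
}
```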
Related
I have a list of videos in my iOS app, which I fetch from my API. The list contains video nodes with an mp4 video file URL on the server, and as the user scrolls from one video to the next, the next video starts and the earlier video pauses. This all works well.

What I need is a way to cache the mp4 videos to disk, such that when the user seeks, I first try to serve the sought chunk from my disk cache; if it isn't cached already, I start loading the chunk from the network, and the chunks are cached to disk while being played.

I also need a way to pre-cache the next videos. For example, if the user starts playing the first video, I want to start caching the second video, so that when the second video starts playing, it plays from the cache up to the point to which it has been cached and then seamlessly hops over to fetching from the server (and of course caching those responses). How do I go about this?
There's no automated way to do this built into the framework. You'll use background NSURLSessions with download tasks and delegate methods.
You can monitor the progress of the downloads, access the temporary files while they download, pause, resume, and cancel them.
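As a rough illustration (Swift, with a hypothetical session identifier and destination path), a background session with a download task and delegate callbacks might be wired up like this:

```swift
import Foundation

final class VideoDownloader: NSObject, URLSessionDownloadDelegate {
    // A background session survives app suspension; the identifier is arbitrary.
    private lazy var session: URLSession = {
        let config = URLSessionConfiguration.background(withIdentifier: "com.example.video-downloads")
        return URLSession(configuration: config, delegate: self, delegateQueue: nil)
    }()

    func download(_ url: URL) {
        session.downloadTask(with: url).resume()
    }

    // Called periodically with progress while the temporary file grows on disk.
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didWriteData bytesWritten: Int64, totalBytesWritten: Int64,
                    totalBytesExpectedToWrite: Int64) {
        let fraction = Double(totalBytesWritten) / Double(totalBytesExpectedToWrite)
        print("progress: \(fraction)")
    }

    // The temporary file must be moved somewhere permanent before this returns.
    func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask,
                    didFinishDownloadingTo location: URL) {
        let caches = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
        let name = downloadTask.originalRequest?.url?.lastPathComponent ?? "video.mp4"
        try? FileManager.default.removeItem(at: caches.appendingPathComponent(name))
        try? FileManager.default.moveItem(at: location, to: caches.appendingPathComponent(name))
    }
}
```

Pause, resume, and cancel map onto the task APIs: cancel(byProducingResumeData:) to pause, downloadTask(withResumeData:) to resume, and cancel() to abandon the download.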
In my application I capture 10 seconds of video and upload it to a server over FTP, and other users can then watch that video via URL(s) returned in a web-service response.
I want to capture video with good quality but a small file size, so it is easy to upload over FTP. Right now I'm uploading in the .mp4 format; if anybody knows a better format that would increase upload speed, please guide me.
Second, I get all those uploaded videos back in the web-service response as URL(s).
Since the response contains many URLs, I need to play the videos in a queue, i.e. one by one, and the end user can swipe LEFT to move to the NEXT video and swipe RIGHT to go back to the previous one. You can see my code here.
Everything is working, but the problem is that uploading and playing (buffering time) take too long.
Please guide me on these points.
COMMENT: I managed to compress the video from 20 MB to 1.6 MB, so upload speed has improved a bit, and I'm now using AVQueuePlayer to play the videos in a queue, but sometimes the video gets stuck while playing.
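For reference, a minimal AVQueuePlayer setup for playing a list of remote URLs one after another could look like this (Swift; the URLs are placeholders):

```swift
import AVFoundation

// Placeholder URLs standing in for the ones returned by the web service.
let urls = [
    URL(string: "https://example.com/videos/clip1.mp4")!,
    URL(string: "https://example.com/videos/clip2.mp4")!
]

let queuePlayer = AVQueuePlayer(items: urls.map { AVPlayerItem(url: $0) })
queuePlayer.play()

// A left-swipe handler would call this to jump to the next clip:
// queuePlayer.advanceToNextItem()
//
// Note that AVQueuePlayer only moves forward; to support swiping back to the
// previous video you have to rebuild the queue (or keep your own item list
// and re-insert items yourself).
```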
Yes,
You could upload the video as chunked data in base64 format.
This can be faster than a plain FTP upload, and it is also useful if the internet connection is lost while you are uploading the video, because you can resume from the last successfully uploaded chunk.
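A rough sketch of that idea (Swift; the endpoint, JSON field names, and chunk size are made up for illustration): the file is read in fixed-size chunks, each chunk is base64-encoded and POSTed together with its index, so the server can reassemble the file and the client can resume after a dropped connection.

```swift
import Foundation

// Hypothetical endpoint and chunk size, purely for illustration.
let endpoint = URL(string: "https://example.com/api/upload-chunk")!
let chunkSize = 256 * 1024  // 256 KB per chunk

func uploadInChunks(fileURL: URL, startingAt firstChunk: Int = 0) throws {
    let handle = try FileHandle(forReadingFrom: fileURL)
    defer { handle.closeFile() }

    var index = 0
    while true {
        let chunk = handle.readData(ofLength: chunkSize)
        if chunk.isEmpty { break }
        if index >= firstChunk {  // skip chunks already uploaded when resuming
            var request = URLRequest(url: endpoint)
            request.httpMethod = "POST"
            request.setValue("application/json", forHTTPHeaderField: "Content-Type")
            let body: [String: Any] = [
                "file": fileURL.lastPathComponent,
                "index": index,
                "data": chunk.base64EncodedString()
            ]
            request.httpBody = try JSONSerialization.data(withJSONObject: body)
            // Fire-and-forget for brevity; real code would upload sequentially
            // and check the server's response before sending the next chunk.
            URLSession.shared.dataTask(with: request).resume()
        }
        index += 1
    }
}
```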
You can upload your files via SFTP; I think it is faster than FTP. Also, your videos may perform badly when played over plain HTTP.
You should follow these steps; I hope they help:
Upload the video via SFTP or Amazon S3
Install a streaming engine such as Wowza or Red5 on your server
Transcode the video for mobile (Wowza usually does this automatically)
Stream your videos over RTSP for Android and HLS (HTTP) for iOS (a minimal playback sketch for the iOS side follows below)
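For the iOS side of that last step, a minimal HLS playback sketch with AVPlayer (Swift; the playlist URL is a placeholder) is:

```swift
import AVKit
import AVFoundation

// Placeholder HLS playlist URL produced by the streaming engine.
let streamURL = URL(string: "https://example.com/live/myStream/playlist.m3u8")!

let player = AVPlayer(url: streamURL)
let controller = AVPlayerViewController()
controller.player = player

// Present `controller` from your current view controller, then:
// player.play()
```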
That's it!
Good luck
I'm working on an app where I'm able to play an HLS m3u8 playlist of a streaming radio (audio only) without any problem, using an instance of AVPlayer. Using Charles I can see that the playlist is updated properly at a normal pace (every 9-10 seconds, fetching one media segment file). When I perform a seekToTime: (back in time), the player successfully plays the stream from the point I want, but in Charles I can see the player start downloading a huge number of media segment files, consuming a lot of data. It seems that the player downloads all the media segment files up to that time and only then resumes the normal behaviour.
I understand that the correct behaviour would be to download the media segment file for the time I'm seeking to, start playing it, and then keep downloading 1 or 2 media segment files every 9-10 seconds, as it does when I play the stream without timeshift.
My question is whether this is normal behaviour, or whether something could be wrong with my m3u8 playlist or my client implementation. Could anyone help me clarify this?
UPDATED: I can confirm this doesn't happen in iOS 7, so it seems to be a bug introduced by iOS 8.
I've been told by Apple that this is not a bug, but a feature. They've made the buffer bigger since iOS 8.
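As an aside, on later iOS versions (10 and up) AVPlayerItem exposes preferredForwardBufferDuration, which lets you hint how far ahead the player should buffer; it is only a hint, but it is the closest thing to reining in that larger buffer:

```swift
import AVFoundation

// Placeholder playlist URL.
let item = AVPlayerItem(url: URL(string: "https://example.com/stream/playlist.m3u8")!)

// Ask the player to keep roughly 30 seconds buffered ahead of the playhead.
// This is a hint, not a hard limit (available from iOS 10).
item.preferredForwardBufferDuration = 30

let player = AVPlayer(playerItem: item)
player.play()
```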
We've got an app we're working on that needs to provide playback of video files via AVPlayer. The files need to be stored on the user's device, but also must playback while downloading.
At the moment we've built a download module that uses the ASIHTTPRequest library to get the video files via PHP (we don't want the media to be linkable via public URLs) and write them to disk asynchronously.
We've then set up an AVPlayer as per the AV Foundation Programming Guide: loading the file with AVURLAsset, making an AVPlayerItem with the asset, building the AVPlayer with the item, then attaching the player to an AVPlayerLayer. The player runs fine with a fully downloaded video file and will also play a progressively downloaded file perfectly well in the simulator.
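For reference, that chain is roughly the following (Swift here for brevity; the file path is a placeholder for wherever the download module writes):

```swift
import AVFoundation

// Placeholder path to the (possibly still-downloading) file on disk.
let fileURL = URL(fileURLWithPath: "/path/to/Documents/video.mp4")

let asset = AVURLAsset(url: fileURL)
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)

let playerLayer = AVPlayerLayer(player: player)
// playerLayer.frame = someView.bounds
// someView.layer.addSublayer(playerLayer)

player.play()
```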
Unfortunately on an actual device, the behavior of the player changes, where instead it seemingly loads the video once and doesn't attempt to grab new packets from disk. The result is that the player will play video and audio up to the point in the video that marks where the download progress was at the time the asset was loaded (e.g. if 2MB of data are buffered then the player is created, the player will only play up to the 2MB worth of data). Because it has the video's header, the player will happily continue thinking it's playing for the full duration of the video, but no other video is rendered to screen.
The last wrinkle is that if the video is inserted into an AVComposition and the AVPlayer is created with that, the progressive video will play fine. This would be a fine solution (and is necessary for our app anyway on occasion) except that the client for our app requires that the video be playable on an Apple TV via AirPlay, which AVCompositions are incapable of.
Does anyone know if there is a way to play progressively downloading video files using an AVPlayer built from AVURLAssets? Is there a way to force the player/playerItem to read from disk with an AVURLAsset the way it seems to do with an AVComposition instead of seemingly caching the video in memory?
Thanks!
I don't have a solution that makes it work with AVURLAsset directly, but I use a slightly different approach. We bundle our app with CocoaHTTPServer and play video files that aren't fully downloaded yet through an HTTP request against the local server.
The server knows the total length of the file and can decide, by looking at the HTTP headers, which part of the file is requested, and then serve it either from disk or from the remote source.
While developing this, there were always three initial requests: one for the first two bytes of the file, one for a larger chunk from the beginning of the file, and one for a chunk right at the end of the file. That is why it was always necessary to load at least that last part directly from the remote server, since the player needs it right from the start. I would guess the same happens for local files, so the player loads the last bytes of the incomplete file (which aren't the right last bytes) and won't play past that length.
You would have to subclass HTTPConnection and write your own HTTPResponse class, using the provided HTTPAsyncFileResponse as a reference.
I hope this gives you an idea of how to accomplish this with a different approach.
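To sketch the client side of that approach (Swift; the port and path scheme are assumptions for illustration, not CocoaHTTPServer specifics): the player is simply pointed at the local server instead of the remote URL, and the server decides per requested range whether to read from disk or fetch from the origin.

```swift
import AVFoundation

// Assumed: an embedded HTTP server (e.g. CocoaHTTPServer) is listening on this
// port and maps /videos/<name> onto the partially downloaded file, proxying any
// byte ranges it doesn't have yet from the remote origin.
let localPort = 12345
let localURL = URL(string: "http://127.0.0.1:\(localPort)/videos/video.mp4")!

// AVPlayer just issues ordinary HTTP (range) requests against the local server.
let player = AVPlayer(url: localURL)
player.play()
```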
In my app I use MPMoviePlayerController to play an mp3 file from a web server. It plays while downloading the whole file, which is fine over WiFi. But now I want it to work over 3G (and get it into the App Store). How do I get it to buffer only the next 10 seconds or so (as per Apple's rules)? I'm digging through the documentation on AVPlayer, HTTP Live Streaming, etc., but I'm still confused about the best way to do this. With so many podcast apps out there, I'm surprised there aren't more tutorials/libraries about it.
Thanks for your time.
I investigated this as well, and I was not able to find a way to limit the look-ahead buffer using MPMoviePlayerController. I believe you would have to load chunks at the network layer and feed them in at the AVFoundation layer, but I have not attempted this myself.
That said, I can confirm that you can get an app approved that plays mp3 files using MPMoviePlayerController over both WiFi and 3G connections. In my app I added a setting so the user can decide whether to enable mp3 downloads over 3G or not, although I don't know if that was needed to get approved. I provided it so users didn't inadvertently incur bandwidth costs.