In my iPhone app I have designed a custom video player. Currently it is very basic, with just play, pause, and stop buttons, but I would like the user to be able to scrub (I think that's the right word) the video like you can with Apple's original media player.
So, for instance, I would like to take a UISlider and have it control the current position of the video's playback, if you see what I mean. In case you're curious, the way I pause/play/stop the video is with this simple piece of code: [self.theMovie play]; [self.theMovie stop]; [self.theMovie pause]; The trouble is I don't know how to scrub the video.
Any help appreciated.
I've asked the same question here: customize quicktime iphone, and here: MPVideoPlayer add/remove buttons
It seems that you have two possibilities:
You can add your view over the main window. A sample can be found here: MoviePlayer Sample
You can iterate through the views, find the one you need, and add/remove views. I don't know yet how much Apple likes or dislikes this method.
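For the scrubbing itself, a minimal sketch (assuming theMovie is an MPMoviePlayerController, as your play/pause/stop code suggests, and scrubber is a UISlider you lay over the player; both names are just placeholders) would drive currentPlaybackTime from the slider:

// Placeholder names: self.theMovie (MPMoviePlayerController), self.scrubber (UISlider).
- (void)viewDidLoad {
    [super viewDidLoad];
    // Wait until the movie's duration is known before setting the slider range.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(durationAvailable:)
                                                 name:MPMovieDurationAvailableNotification
                                               object:self.theMovie];
    [self.scrubber addTarget:self
                      action:@selector(scrub:)
            forControlEvents:UIControlEventValueChanged];
}

- (void)durationAvailable:(NSNotification *)note {
    self.scrubber.maximumValue = self.theMovie.duration;
}

- (void)scrub:(UISlider *)slider {
    // Jump playback to wherever the user dragged the slider.
    self.theMovie.currentPlaybackTime = slider.value;
}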
Goal:
To add a YouTube-like preview feature when the user seeks manually using the player's seek bar.
From what I understand so far, I will have to add an "I-Frame only playlist" to my stream to enable trick play, but I am not able to figure out how I will use this to show the preview view on the video player.
Other solutions I considered:
AVAssetImageGenerator: It does not work on streams. Explained here.
This says that if my .m3u8 file contains an "I-Frame only playlist", AVAssetImageGenerator will start returning snapshots, but even if it does, generating thumbnails for a complete 1-hour video upfront is just not optimal.
AVPlayerItemVideoOutput: This also seems like a very brute-force way to approach the problem, as I would need thumbnails of almost the complete video.
Current player implementation:
I have added AVPlayerLayer as a sublayer to my view controller's view and added custom controls on top of it.
I am thinking of using something like this https://github.com/pbs/iframe-playlist-generator to add the I-Frame playlist.
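Here is a very rough sketch of the direction I have in mind on top of that: point a second, muted AVPlayer at the I-frame-only rendition and seek it while the user drags the seek bar. I haven't verified this works; iFramePlaylistURL, previewPlayer, previewLayer, and scrubberMoved: are placeholder names of mine:

// Placeholder names throughout; the I-frame playlist URL would come from the generated .m3u8.
- (void)setUpPreviewWithURL:(NSURL *)iFramePlaylistURL {
    self.previewPlayer = [AVPlayer playerWithURL:iFramePlaylistURL];
    self.previewPlayer.muted = YES;
    self.previewLayer = [AVPlayerLayer playerLayerWithPlayer:self.previewPlayer];
    self.previewLayer.frame = CGRectMake(0, 0, 160, 90); // small thumbnail above the seek bar
    [self.view.layer addSublayer:self.previewLayer];
}

// Called while the user drags the custom seek bar.
- (void)scrubberMoved:(UISlider *)slider {
    CMTime target = CMTimeMakeWithSeconds(slider.value, NSEC_PER_SEC);
    // Zero tolerance on an I-frame-only playlist should snap to a nearby keyframe quickly.
    [self.previewPlayer seekToTime:target
                   toleranceBefore:kCMTimeZero
                    toleranceAfter:kCMTimeZero
                 completionHandler:^(BOOL finished) {}];
}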
PS: I am new to this, so if I have made any wrong assumption, please let me know.
Also, any links or references to some reading material I can use to dive in deeper are appreciated. Thanks.
I'm creating an AVPlayer view to watch a stream from CCTV, and I want to make playback controls like in the Photos app on iPhone (open Photos, play some video, and at the bottom there is a UIScrollView with some screenshots of your video), as in the image below.
May I ask for some help on how to create something like this?
This is my first question here, so sorry if something is wrong. =)
[Image: ScrollView playback]
AVPlayer cannot do this for streaming. The suggestion about snapshots would also not come close to what you want, which is the same behaviour as the local preview of videos in the Photos app on iOS.
If you are looking for this kind of experience, I would recommend heading over to movi.ai to get your hands on our cross-platform solution, which has this ready out of the box.
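For reference, if the source were a file-based asset rather than a live CCTV stream, a thumbnail strip could be sketched with AVAssetImageGenerator roughly like this (fileURL is a placeholder; as noted above, this does not work on a live stream):

// Sketch for file-based assets only.
AVAsset *asset = [AVAsset assetWithURL:fileURL]; // fileURL: a local/file-based video
AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
generator.maximumSize = CGSizeMake(100, 56); // thumbnail size for the strip

NSMutableArray *times = [NSMutableArray array];
Float64 duration = CMTimeGetSeconds(asset.duration);
for (int i = 0; i < 10; i++) {
    [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(duration * i / 10.0, 600)]];
}

[generator generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        // Wrap the CGImage in a UIImage and append it to the scroll view on the main thread.
    }
}];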
I am new to the iOS app development field. I am trying to capture video using AVFoundation, and I am successful in this. But when I tried to play the video back using MPMoviePlayerController, I ran into too many issues, so I am trying to play it using AVPlayer.
But with AVPlayer there are two approaches: AVPlayerLayer and AVPlayerViewController.
I tried searching about these, but I didn't find any particular reason to choose one over the other.
Can anyone suggest which is better to use?
AVPlayerViewController is an all-in-one solution. You set up your AVPlayer with a video and present the player controller. It handles all of the playback and has its own controls baked in (I'm sure you've seen this in other apps). It is the simplest way to show a video.
AVPlayerLayer is for when you want to add some customization, like adding your own controls or extra views, or not making the video full screen.
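If the standard controls are enough for you, a minimal sketch of the AVPlayerViewController route looks like this (videoURL is a placeholder for your captured file's URL):

// AVKit's all-in-one player. videoURL is a placeholder.
AVPlayer *player = [AVPlayer playerWithURL:videoURL];
AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
playerVC.player = player;
[self presentViewController:playerVC animated:YES completion:^{
    [playerVC.player play];
}];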
I need to play a video when the view loads, but with the other code I have tried, the full video player comes up. I want to play the video inside a UIView. Can someone please help!
You will need to use an AVPlayer, specifying an AVPlayerItem. This should allow you to use multiple AV items within a single view.
Basically, everything is explained here:
https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
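In short, a minimal sketch (player, videoContainerView, and videoURL are placeholder properties here, not something from your project) creates the player in viewDidLoad and attaches an AVPlayerLayer to your view's layer:

- (void)viewDidLoad {
    [super viewDidLoad];
    NSURL *videoURL = self.videoURL; // placeholder: the URL of the video you want to play
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:videoURL];
    self.player = [AVPlayer playerWithPlayerItem:item];

    // The layer keeps the video inside your UIView instead of a full-screen player.
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.videoContainerView.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.videoContainerView.layer addSublayer:playerLayer];

    [self.player play];
}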
I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't make comments, so I had to post this as an answer, although it might not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself by listening to the UIScreen connection notifications.
I have to say it all worked pretty much perfectly. I first check whether there is more than one screen, and if there is, I simply move the AVPlayer to that screen while displaying a simple message on the device's screen saying that the content is being played on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
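An illustrative sketch of that approach (the names here are mine, not from the question): listen for UIScreenDidConnectNotification, create a UIWindow on the new UIScreen, and attach the AVPlayerLayer there:

- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(screenDidConnect:)
                                                 name:UIScreenDidConnectNotification
                                               object:nil];
}

- (void)screenDidConnect:(NSNotification *)note {
    UIScreen *externalScreen = note.object;
    self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
    self.externalWindow.screen = externalScreen;

    // Move the player layer onto the external window.
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    layer.frame = self.externalWindow.bounds;
    [self.externalWindow.layer addSublayer:layer];
    self.externalWindow.hidden = NO;

    // On the device itself, just show a "Playing on <AirPlay device name>" label.
}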
That was fine until I noticed that the composition plays really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things, like lowering the resolution, lowering the renderScale, and so on, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must be using a composition in order to display it), right after you tap the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with issues.
In the app's .plist, create a new item called:
Required background modes
and add a new array element called:
App plays audio or streams audio/video using AirPlay
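In the raw Info.plist source, that corresponds to the UIBackgroundModes key with the "audio" value:

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>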
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!