Video plays very fast sometimes in AVPlayer - iOS

In AVPlayer we apply a rate to the video for slow motion and also for negative (reverse) playback. If the video ends and we then apply the rate to the AVPlayer, the video plays very fast instead of at the given rate. I have been struggling with this for more than two days. Thanks.

Are you calling [avPlayer setRate:0.3] and [avPlayer play] together? If so, the second call overrides the first one and sets the rate back to 1.0 (normal speed).
The setRate: method already sets the rate and starts playback, so the extra play call isn't needed.
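A minimal sketch of that advice, assuming the fast playback occurs after the item has reached its end: seek back to the start first, then set the rate, and don't call play afterwards.
// Restart a finished item at a custom rate without calling -play.
// avPlayer is assumed to be an existing AVPlayer with a loaded item.
[avPlayer seekToTime:kCMTimeZero
     toleranceBefore:kCMTimeZero
      toleranceAfter:kCMTimeZero
   completionHandler:^(BOOL finished) {
    if (finished) {
        avPlayer.rate = 0.3f; // starts playback at 0.3x; no separate [avPlayer play] needed
    }
}];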

Related

How to increase the AVPlayer load rate

When I used the player, I found that the AVPlayer buffered up to a certain point, but the player still did not enter the prepared (ready-to-play) state.
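A sketch of how this is commonly investigated, assuming an iOS 10+ target and that playerItem is the AVPlayerItem being loaded: shorten the forward buffer the player wants before reporting readiness, and observe the item's status and playbackLikelyToKeepUp properties.
// Ask for a smaller forward buffer and watch readiness via KVO.
playerItem.preferredForwardBufferDuration = 2.0;      // seconds; 0 lets AVFoundation decide
avPlayer.automaticallyWaitsToMinimizeStalling = NO;   // start playback as soon as possible (iOS 10+)
[playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
[playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:nil];
// In observeValueForKeyPath:..., start playback once
// playerItem.status == AVPlayerItemStatusReadyToPlay && playerItem.playbackLikelyToKeepUp.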

Synchronizing Looping Video to Custom Audio

WWDC 2012's Real-Time Media Effects and Processing during Playback session explains how to synchronize an AVPlayer with custom audio. Paraphrasing, you
start playback of the AVPlayer and custom audio at the same time
play both at the same rate
The first part is achieved by priming the AVPlayer with prerollAtRate:completionHandler:, so playback can be started with "minimal latency", and the second part by making the AVPlayer use The iOS Audio Clock.
The code snippet assumes you have calculated the future host time at which you anticipate the audio hitting the speaker (taken literally, this last phrase seems to imply supreme omniscience, so let's just read it as your [desired] audio start host time).
CMClockRef audioClock = NULL;
OSStatus err = CMAudioClockCreate(kCFAllocatorDefault, &audioClock);
if (err == noErr) {
    [myPlayer setMasterClock:audioClock];
    [myPlayer prerollAtRate:1.0 completionHandler:^(BOOL finished) {
        if (finished) {
            // Calculate future host time here
            [myPlayer setRate:1.0 time:newItemTime atHostTime:hostTime];
        } else {
            // Preroll interrupted or cancelled
        }
    }];
}
It's a tiny amount of code, yet it raises so many questions. What happens if the preroll currentTime and newItemTime don't agree? Don't video and audio play at the same rate of one second per second? So shouldn't their clocks be the same? Doesn't 48kHz divide 60fps? How can the code only need to know the desired start time and no other details of my audio code? Is it due to the one iOS Audio Clock? Is this API ingenious or an awful non-orthogonal mish-mash that won't compose well with other AVFoundation features?
Despite my doubts, the code seems to work, but I want to seamlessly loop the video and custom audio. The question is how?
I can't preroll the playing AVPlayer because that happens from currentTime (and the player wouldn't appreciate having its buffers changed while playing). Maybe an alternating pair of prerolled AVPlayers? AVPlayerLooper sounds promising. It's not actually an AVPlayer, but it wraps an AVQueuePlayer (which is). Assuming preroll works on the AVQueuePlayer and I pay extra special attention to looping the custom audio, then this may work. Otherwise I think the only remaining option is to drop the prerolling and shoehorn the video and custom audio into an audio tap within an AVComposition, which would be looped with the help of an AVPlayerLooper.
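For reference, a minimal sketch of the AVPlayerLooper route mentioned above, assuming videoURL points at the clip to loop; the custom audio would still have to be looped and scheduled separately against the same audio clock, and whether preroll behaves on an AVQueuePlayer remains to be verified.
// Gapless video looping via AVPlayerLooper wrapping an AVQueuePlayer (iOS 10+).
AVPlayerItem *templateItem = [AVPlayerItem playerItemWithURL:videoURL];
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:@[templateItem]];
AVPlayerLooper *looper = [AVPlayerLooper playerLooperWithPlayer:queuePlayer templateItem:templateItem];
// Keep a strong reference to looper for as long as looping should continue.
queuePlayer.masterClock = audioClock;   // keep the looping video on the audio clock, as before
[queuePlayer play];                     // or prerollAtRate:/setRate:time:atHostTime: as above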

Synchronize multiple AVPlayers

I am trying to find a solution to a problem I have. I have 5 UIViews, all at the same position. Each UIView holds an AVPlayer with a different video. To be more precise, they are all the same video, but encoded at different playback speeds.
Video 1x speed
Video 4x speed
Video 8x speed
Video 16x speed
Video 24x speed
By default video 1 is visible and playing, but I need to be able to switch between the videos without the switch being visible to the user, so I have to keep them synchronized. If I am watching video 1 and switch to video 2, then video 2 should play exactly at the position where video 1 stopped.
The idea is that it should look as if the video speeds up after an action, e.g. a flick gesture.
I hope I have described my issue well enough, and I am very thankful for any suggestion.
In the end I am using a periodic observer that takes a snapshot of currentTime every 5 seconds and calls seekToTime: on all the other AVPlayers. This works fine to keep them synchronized. I just needed to adapt the CMTime for each player according to its speed. As an example, here is the 4x video:
CMTime videoPosition = player1.currentTime;   // current playback position of the 1x player
Float64 videoPositionInSeconds = (Float64)videoPosition.value / videoPosition.timescale;   // convert the CMTime into seconds
// The 4x clip reaches the same content a quarter of the way in, hence the division by 4.
[player2 seekToTime:CMTimeMakeWithSeconds(videoPositionInSeconds / 4.0, player1.currentItem.asset.duration.timescale)
    toleranceBefore:kCMTimeZero
     toleranceAfter:kCMTimeZero];
Hope this helps.
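A sketch of the periodic observer described above, assuming player1 is the 1x player and syncOtherPlayersToSeconds: is a hypothetical helper that performs the per-speed seeks shown in the snippet:
// Resynchronize the other players every 5 seconds of 1x playback.
__weak typeof(self) weakSelf = self;
self.syncObserver = [player1 addPeriodicTimeObserverForInterval:CMTimeMakeWithSeconds(5.0, NSEC_PER_SEC)
                                                           queue:dispatch_get_main_queue()
                                                      usingBlock:^(CMTime time) {
    [weakSelf syncOtherPlayersToSeconds:CMTimeGetSeconds(time)];   // hypothetical helper doing the divided seeks
}];
// Later, when the views go away: [player1 removeTimeObserver:self.syncObserver];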

AVPlayer seekToTime only generates frame every second

How can I make AVPlayerLayer display the exact frame of the video when seeking with seekToTime:?
Right now, with [avPlayer seekToTime:CMTimeMakeWithSeconds(seconds, timescale)], the AVPlayerLayer only displays a frame at roughly 1-second steps, for example 1.50, 2.50, 3.50. I want it to display the frame at 4.45 seconds, for example. Is that possible?
You can use [AVPlayer seekToTime: toleranceBefore: toleranceAfter:] to get random media access with higher precision.
To get the highest precision possible, pass kCMTimeZero as argument for both tolerances. Note that this might add noticeable delay during seeks.
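A minimal sketch of the frame-accurate variant, assuming avPlayer has the item loaded and 4.45 seconds is the target position:
// Seek with zero tolerance so the layer shows the frame at (or nearest to) 4.45 s.
CMTime target = CMTimeMakeWithSeconds(4.45, 600);   // 600 is a common video timescale
[avPlayer seekToTime:target
     toleranceBefore:kCMTimeZero
      toleranceAfter:kCMTimeZero
   completionHandler:^(BOOL finished) {
    // finished is NO if another seek interrupted this one
}];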

MPMoviePlayerController play video starting from 30th sec (from mid of the video)

I am creating a simple media player using MPMoviePlayerController. I want to play videos from a particular point, for example from the 30th or 50th second onwards, and I also want to be able to move the playhead to any particular point and start playing from there. I tried initialPlaybackTime and MPMoviePlaybackStateSeekingForward, but had no luck.
The video is not local; it is streamed from a server.
Please help me do this.
MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];   // videoURL is your stream URL
mp.initialPlaybackTime = 84;   // seconds
mp.endPlaybackTime = 118;      // seconds
[mp play];
This will start movie playback at the 84th second and stop it at the 118th second.
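For the second part of the question (moving the playhead while the movie is already playing), a hedged sketch using the currentPlaybackTime property from the MPMediaPlayback protocol:
// Jump to the 50th second while playback is in progress.
mp.currentPlaybackTime = 50.0;   // NSTimeInterval, in seconds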
