I have been using MPMoviePlayer and its playableDuration property to check the buffered duration of a movie.
playableDuration always seems to be only ~1 second ahead of the current playback position, and I would like to increase this buffer.
I have tried prepareToPlay, but it seems to do nothing noticeable to playableDuration.
I have also tried pre-emptively setting as many parameters as possible, such as MPMovieSourceType, MediaType, and the like, but all to no avail.
Just to clear a few things up first: I am using both MPMoviePlayer and AVPlayer, which play different streams simultaneously, since the video and audio I am using are split.
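For reference, a minimal Swift sketch of the pre-configuration described above; streamURL is a placeholder, and MPMoviePlayerController has been deprecated since iOS 9:

import MediaPlayer

// Hedged sketch: declare the source type up front, then prepare playback.
let moviePlayer = MPMoviePlayerController(contentURL: streamURL)
moviePlayer.movieSourceType = .streaming   // tell the player this is a stream before it probes
moviePlayer.prepareToPlay()
// playableDuration reports how far ahead of currentPlaybackTime the buffer reaches.
let bufferAhead = moviePlayer.playableDuration - moviePlayer.currentPlaybackTime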
EDIT
It seems I overlooked the file size affecting the stream and should have read more in the Apple resources than elsewhere. As far as I can tell, the issue is that the file size is too large, so a server-side media segmenter has to be implemented.
Apple Resource on Media Segmenting
Related
After spending some time setting up a transcoding pipeline on AWS, I am finding that video loading times have not been lowered as expected with HLS (m3u8).
It seems that if I am using AVPlayer directly, without AVPlayerViewController, I may need to manage the video stream quality myself? My understanding was that with an m3u8, this would be handled automatically and the best quality would be chosen based on network conditions, device, etc.?
So far it seems that the loading times are the same, if not slightly worse, than without the m3u8 when AVPlayer is used as-is.
To better understand what's going on, I've been trying out a few things.
1) While doing the following has reduced loading times, I would prefer to do a bit more than just lower the bit rate all the way when not on wifi:
self.player?.currentItem?.preferredPeakBitRate = 1
This gives me a pretty low-quality video, but it loads quickly. I have yet to figure out how to detect the actual bitrate being used, though (see the access-log sketch after this list). Since setting this value has improved loading times dramatically, I am going to assume AVPlayer does not handle the adjustments on its own?
2) Also, I haven't had any luck with the following (it causes an infinite spinner, even with preferredPeakBitRate set to 1):
self.player.automaticallyWaitsToMinimizeStalling = false
3) I am open to using a third-party library that might handle this; I found something called VKVideoPlayer that might do some of it.
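As mentioned in 1), one way to detect the bitrate AVPlayer actually selected is the item's playback access log; a hedged sketch (the properties are real AVFoundation API, where and when to read them is an assumption):

import AVFoundation

// Each access log event describes a stretch of playback. indicatedBitrate is
// the bitrate advertised by the HLS variant in use; observedBitrate is the
// measured network throughput.
if let event = self.player?.currentItem?.accessLog()?.events.last {
    print("indicated bitrate: \(event.indicatedBitrate) bps")
    print("observed bitrate: \(event.observedBitrate) bps")
}
// .AVPlayerItemNewAccessLogEntry notifications fire when a new entry is added,
// which is a reasonable hook for spotting variant switches.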
Thanks
This is possible in iOS 8 and onwards.
The following is copied from Apple's documentation:
The desired limit, in bits per second, of network bandwidth consumption for this item.
Swift: var preferredPeakBitRate: Double
Objective-C: @property(nonatomic) double preferredPeakBitRate
Set preferredPeakBitRate to non-zero to indicate that the player should attempt to limit item playback to that bit rate, expressed in bits per second.
If network bandwidth consumption cannot be lowered to meet the preferredPeakBitRate, it will be reduced as much as possible while continuing to play the item.
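For example, a minimal sketch of using it; streamURL and the cellular check are placeholders, and the 2 Mbps cap is an arbitrary example value, not a recommendation:

import AVFoundation

let item = AVPlayerItem(url: streamURL)
if isOnCellular {                          // your own reachability check (assumption)
    item.preferredPeakBitRate = 2_000_000  // cap playback at ~2 Mbps off wifi
}
let player = AVPlayer(playerItem: item)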
I have two solutions to this problem:
SOLUTION A
Convert the asset to an AVMutableComposition.
For every second, keep only one frame by removing the timing for all the other frames using the removeTimeRange(...) method (a minimal sketch follows).
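A minimal sketch of this step, assuming a 30 fps source and an AVMutableComposition named composition; removing ranges back-to-front keeps the earlier offsets valid as material is deleted:

import AVFoundation

// Keep the first frame of each second; remove the rest of that second.
let fps: Int32 = 30
let seconds = Int(composition.duration.seconds)
for second in stride(from: seconds - 1, through: 0, by: -1) {
    let start = CMTimeMake(value: Int64(second) * Int64(fps) + 1, timescale: fps)
    let end = CMTimeMake(value: Int64(second + 1) * Int64(fps), timescale: fps)
    composition.removeTimeRange(CMTimeRange(start: start, end: end))
}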
SOLUTION B
Use AVAssetReader to extract all individual frames as an array of CMSampleBuffer.
Write the [CMSampleBuffer] back into a movie, skipping every 20 frames or so as required.
Convert the resulting video file to an AVMutableComposition and use scaleTimeRange(...) to reduce the overall timeRange of the video for the timelapse effect (see the sketch after this list).
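A sketch of the scaleTimeRange(...) step; the composition name and the 20x speed-up factor are assumptions:

import AVFoundation

// Compress the composition's full duration to 1/20th for a timelapse effect.
let fullRange = CMTimeRange(start: .zero, duration: composition.duration)
let newDuration = CMTimeMultiplyByRatio(composition.duration, multiplier: 1, divisor: 20)
composition.scaleTimeRange(fullRange, toDuration: newDuration)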
PROBLEMS
The first solution is not suitable for full-HD videos: the video freezes in multiple places and the seek bar shows inaccurate timing.
E.g., a 12-second timelapse might be shown as having a duration of only 5 seconds, so it keeps playing even after the seek bar has finished.
In other words, the timing of the video gets completely messed up for some reason.
The second solution is incredibly slow. For a 10-minute HD video, memory usage grows without bound, since all processing is done in memory.
I am searching for a technique that can produce a timelapse for a video right away, without waiting. Solution A kind of does that, but is unsuitable because of the timing problems and stuttering.
Any suggestion would be great. Thanks!
You might want to experiment with the built-in thumbnail generation functions to see if they are fast/efficient enough for your needs.
They have the benefit of being optimised to generate images efficiently from a video stream.
Simply displaying a 'slide show'-like view of the thumbnails one after another may give you the effect you are looking for.
There is information on the key class, AVAssetImageGenerator, here, including how to use it to generate multiple images:
https://developer.apple.com/reference/avfoundation/avassetimagegenerator#//apple_ref/occ/instm/AVAssetImageGenerator/generateCGImagesAsynchronouslyForTimes%3acompletionHandler%3a
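A hedged sketch of that multi-image API; videoURL and the one-second sampling interval are assumptions:

import AVFoundation

let asset = AVURLAsset(url: videoURL)
let generator = AVAssetImageGenerator(asset: asset)
generator.appliesPreferredTrackTransform = true
// Request one thumbnail per second across the asset's duration.
let times: [NSValue] = stride(from: 0.0, to: asset.duration.seconds, by: 1.0)
    .map { NSValue(time: CMTimeMakeWithSeconds($0, preferredTimescale: 600)) }
generator.generateCGImagesAsynchronously(forTimes: times) { requested, cgImage, actual, result, error in
    if result == .succeeded, let cgImage = cgImage {
        // Hand each frame to the slide-show view on the main queue.
        print("got frame at \(actual.seconds)s: \(cgImage)")
    }
}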
The app I'm working on loops a video a specified number of times by adding the same AVAssetTrack (created from the original video URL) multiple times to the same AVComposition at successive intervals. The app similarly inserts a new video clip into an existing composition by 'removing' the time range from the composition's AVMutableCompositionTrack (for AVMediaTypeVideo) and inserting the new clip's AVAssetTrack into the previously removed time range.
However, occasionally and somewhat rarely, after inserting a new clip as described above into a time range within a repeat of the original looping video, blank frames appear at the video loop's transition points (within the composition), but only during playback: the video exports correctly, without gaps.
This leads me to believe the issue is with the AVPlayer or AVPlayerItem and how the frames are buffered for playback, rather than with how I'm inserting/looping the clips or choosing the correct CMTime stamps to do so. The app is doing a bunch of things at once (loop visualization in the UI via an NSTimer, audio playback via Amazing Audio Engine). Could my issue be a result of competition for resources?
One more note: I understand that discrepancies between audio and video in an asset can cause glitches (i.e. the underlying audio is a little bit longer than the video length), but as I'm not adding an audioEncodingTarget to the GPUImageWriter that I'm using to record and save the video, the videos have no audio components.
Any thoughts or directions you can point me in would be greatly appreciated! Many thanks in advance.
Update: the flashes coincide with the "Had to drop a video frame" error logged by the GPUImage library, which according to its creator has to do with the phone not being able to process video fast enough. Could multi-threading solve this?
Update 2: the flashes actually don't always correspond to the "Had to drop a video frame" error. I have also disabled all of the AVRecorder/Amazing Audio Engine code and the issue still persists, so it's not a problem of resource competition between those engines. I have been logging properties of the AVPlayerItem and notice that isPlaybackLikelyToKeepUp is always NO and isPlaybackBufferFull is always YES.
So the problem is solved; it's sort of frustrating how brutally simple the fix is. I just used a time range one frame shorter when adding the videos to the composition, rather than the AVAssetTrack's full time range. No more flashes. Hopefully the users won't miss that 1/30th of a second :)
// One frame shorter than the track's own duration, assuming a 30 fps source.
CMTime shortenedDuration = CMTimeSubtract(originalVideoAssetTrack.timeRange.duration, CMTimeMake(1, 30));
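In Swift terms, the same trim looks roughly like this; the track and composition names are assumptions, and 1/30 s again matches a 30 fps source:

import AVFoundation

let oneFrame = CMTimeMake(value: 1, timescale: 30)
let shortenedDuration = CMTimeSubtract(videoAssetTrack.timeRange.duration, oneFrame)
let insertRange = CMTimeRange(start: videoAssetTrack.timeRange.start, duration: shortenedDuration)
// Insert the clip using the range that is one frame shorter than the track's own.
try compositionVideoTrack.insertTimeRange(insertRange, of: videoAssetTrack, at: composition.duration)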
I have just built the VLC library for iOS (VLCKit)
and am using it to display a video stream. I need it to display in real time with the lowest possible latency, so I tried to find a way to reduce the number of buffered frames (or something similar) before they are displayed in a UIView.
I started looking into the MobileVLCKit module, but no property seems to let me control that.
I am wondering whether the change can be made in MobileVLCKit itself or in the underlying VLC library.
If so, will I need to modify the library and rebuild it? Which parameter should I change?
After spending a fair amount of time looking into the VLC library without success, I tried streaming over RTSP instead of RTMP, and the real-time behaviour of the video improved.
I also found a workaround: setting a timer that forces the player to skip forward past buffered frames. This can cause stuttering, but it keeps the video closer to real time.
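As an aside, one parameter worth trying (my suggestion, not something confirmed above): VLC exposes a :network-caching media option, in milliseconds, which controls how much the player buffers. A minimal sketch, assuming MobileVLCKit's VLCMedia.addOption(_:) API and a placeholder streamURL:

import MobileVLCKit

let media = VLCMedia(url: streamURL)
media.addOption(":network-caching=150")   // VLC's default is around 1000 ms
let player = VLCMediaPlayer()
player.media = media
player.drawable = videoView               // the UIView to render into
player.play()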
I am interested in recording media using an AVCaptureSession in iOS while playing media back using an AVPlayer (specifically, I am playing back audio and recording video, but I'm not sure it matters).
The problem is, when I play the resulting media back together later, they are out of sync. Is it possible to synchronize them, either by ensuring that playback and recording start simultaneously, or by discovering what the offset is between them? I probably need the sync to be on the order of 10 ms. It is unreasonable to assume that I can always capture audio (since the user may use headphones), so syncing via analysis of original and recorded audio is not an option.
This question suggests that it's possible to end playback and recording simultaneously and determine the initial offset from the resulting lengths that way, but I'm unclear how to get them to end simultaneously. I have two cases: 1) the audio playback runs out, and 2), the user hits the "stop recording" button.
This question suggests priming and then applying a fixed, but possibly device-dependent delay, which is obviously a hack, but if it's good enough for audio it's obviously worth considering for video.
Is there another media layer I can use to perform the required synchronization?
Related: this question is unanswered.
If you are specifically using AVPlayer to play back audio, I would suggest using Audio Queue Services instead. It is seamless and fast, as it reads buffer by buffer, and play/pause is faster than with AVPlayer.
There is also the possibility that you are missing an initial [avPlayer prepareToPlay] call, which might be causing extra overhead while it syncs up before playing the audio.
Hope it helps you.
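One more direction worth exploring (my own suggestion, not part of the answer above): AVPlayer can schedule playback against the host clock via setRate(_:time:atHostTime:), which makes the playback start time directly comparable with capture timestamps. A hedged sketch; the 0.5 s lead time is an arbitrary example:

import AVFoundation

// Scheduling against the host clock requires disabling automatic stalling.
player.automaticallyWaitsToMinimizeStalling = false
let now = CMClockGetTime(CMClockGetHostTimeClock())
let delay = CMTimeMakeWithSeconds(0.5, preferredTimescale: 1_000_000_000)
// Start playback exactly 0.5 s from now. Capture sample buffers carry
// presentation timestamps on the same host clock, so the offset between
// playback start and recorded frames becomes computable.
player.setRate(1.0, time: .invalid, atHostTime: CMTimeAdd(now, delay))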