What are the default maximum and minimum buffer sizes of the iOS AVPlayer? How can I increase the buffer size for MP3 streaming? Thank you.
If by buffer size you mean the playback buffer, then I believe this size is undocumented.
This behaviour is provided by AVFoundation; quoting from the Advances in AVFoundation Playback session:
And we see that playback starts but then we quickly run into a stall because we didn't have enough buffer to play to the end.
In that case, we'll go into the waiting state and re-buffer until we have enough to play through.
They only say "enough", not an exact time or size; it makes sense that this is a dynamic amount that adapts to the current network conditions, but who knows.
Back to the topic: if you would like to take control of buffering, you can observe AVPlayerItem's loadedTimeRanges property.
The code snippet below waits for roughly 5 seconds of buffered media before resuming playback:
// In observeValueForKeyPath:ofObject:change:context:
if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
    NSArray *timeRanges = (NSArray *)[change objectForKey:NSKeyValueChangeNewKey];
    if (timeRanges && [timeRanges count]) {
        CMTimeRange timerange = [[timeRanges objectAtIndex:0] CMTimeRangeValue];
        // Only resume automatically if we are stalled (rate == 0) and not paused on purpose.
        if (self.audioPlayer.rate == 0 && !pauseReasonForced) {
            // End of the buffered range: start + duration.
            CMTime bufferedTime = CMTimeAdd(timerange.start, timerange.duration);
            // Resume once at least 5 seconds beyond the current position are buffered.
            CMTime milestone = CMTimeAdd(self.audioPlayer.currentTime, CMTimeMakeWithSeconds(5.0f, timerange.duration.timescale));
            if (CMTIME_COMPARE_INLINE(bufferedTime, >, milestone) && self.audioPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay && !interruptedWhilePlaying && !routeChangedWhilePlaying) {
                if (![self isPlaying]) {
                    [self.audioPlayer play];
                }
            }
        }
    }
}
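For reference, the observer above has to be registered on the player item first. A minimal sketch, using the same self.audioPlayer property as in the snippet; where exactly you register and remove it depends on your player's lifecycle:

// Register for KVO updates on the item's loadedTimeRanges.
[self.audioPlayer.currentItem addObserver:self
                               forKeyPath:@"loadedTimeRanges"
                                  options:NSKeyValueObservingOptionNew
                                  context:nil];

// And remove it before the item goes away:
// [self.audioPlayer.currentItem removeObserver:self forKeyPath:@"loadedTimeRanges"];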
I'm building a custom camera to film at Full HD or plain HD quality. The issue is that after I set the camera to 25 frames per second with the following code:
- (void)setFrameRate:(AVCaptureDevice *)camera {
    NSError *error;
    if (![camera lockForConfiguration:&error]) {
        NSLog(@"Could not lock device %@ for configuration: %@", camera, error);
        return;
    }
    AVCaptureDeviceFormat *format = camera.activeFormat;
    double epsilon = 0.00000001;
    int desiredFrameRate = 25;
    // Only apply the frame duration if the active format supports 25 fps.
    for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
        if (range.minFrameRate <= (desiredFrameRate + epsilon) &&
            range.maxFrameRate >= (desiredFrameRate - epsilon)) {
            [camera setActiveVideoMaxFrameDuration:CMTimeMake(10, desiredFrameRate * 10)];
            [camera setActiveVideoMinFrameDuration:CMTimeMake(10, desiredFrameRate * 10)];
            break;
        }
    }
    [camera unlockForConfiguration];
}
It changes the video fps, but not to exactly the 25 frames per second I set in the method. It fluctuates between 23.93 and 25.50 frames per second.
Does anyone know why?
After several attempts and some debugging, I found out that the frame rate not being exactly 25 fps has to do with the recording method, not with the device setup.
I was using an AVAssetWriter object to record the video, like the example shown at the following link (https://reformatcode.com/code/ios/ios-8-film-from-both-back-and-front-camera).
But there was no way to get exactly 25 fps.
I changed the recording object to AVCaptureMovieFileOutput, and from there setup and recording were super easy (see the sketch below). The result is much more precise, between 25 and 25.01 fps.
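A minimal sketch of that switch, assuming you already have a configured AVCaptureSession named session and that self implements AVCaptureFileOutputRecordingDelegate (the session, file name, and delegate are placeholders):

// Replace the AVAssetWriter pipeline with a movie file output on the existing session.
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}

// Start writing to a temporary file; the delegate is notified when recording finishes.
NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.mov"]];
[movieOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

// Later: [movieOutput stopRecording];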
I want the MPMusicPlayerController.applicationMusicPlayer() instance to start playing from a specific starting time:
applicationMusicPlayer.setQueueWithStoreIDs([String(track.id)])
applicationMusicPlayer.currentPlaybackTime = 10.0
print(applicationMusicPlayer.currentPlaybackTime) // will print 10.0
But as soon as the player starts playing the item, it will reset its currentPlaybackTime to zero and will start from the beginning:
applicationMusicPlayer.play()
print(applicationMusicPlayer.currentPlaybackTime) // will print 0.0
I thought maybe it was because I set the playback time on a player that had just been created and was not ready yet, but neither .prepareToPlay() nor .isPreparedToPlay() helps in that situation.
I even tried waiting a few seconds and only then starting playback from the position I had set. No success at all; extremely frustrating.
Maybe it is somehow connected to the fact that I'm playing songs from Apple Music directly? I can't switch to AVPlayer because I have to play music from Apple Music.
Would appreciate any thoughts and help!
UPDATE: I found the method beginSeekingBackward on MPMediaPlayback, and the documentation says it has no effect if the content is streamed:
https://developer.apple.com/documentation/mediaplayer/mpmediaplayback/1616248-beginseekingbackward?language=objc
It seems like only the built-in Music app has control over streaming music playback?
I just ran into this exact problem and managed to get around it with the following code.
It dispatches a background block that continuously checks the player's currentPlaybackTime. As soon as currentPlaybackTime is no longer the time I set, I set it back to what I wanted.
It feels like a terrible hack, but it has been working for me so far.
MPMusicPlayerController *player = [MPMusicPlayerController systemMusicPlayer];
player.currentPlaybackTime = _startTime;
[player play];

// Busy-wait on a background queue; as soon as the player resets the playback
// time, push it back to the desired start time and stop checking.
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0ul);
dispatch_async(queue, ^{
    while (true) {
        if (player.currentPlaybackTime != _startTime) {
            player.currentPlaybackTime = _startTime;
            break;
        }
    }
});
I am creating a custom video player using AVPlayer on iOS (Objective-C). I have a settings button which, when tapped, will display the available video dimensions and audio formats.
Below is the design:
So, I want to know:
1) How do I get the available dimensions from a video URL (not a local video)?
2) Even if I am able to get the dimensions, can I switch between them while playing in AVPlayer?
Can anyone give me a hint?
If it is not an HLS (streaming) video, you can get the resolution from the asset's video track.
Sample code:
// The player is playing.
if (_player.rate != 0 && _player.error == nil)
{
    AVAssetTrack *track = [[_player.currentItem.asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (track != nil)
    {
        // Apply the preferred transform so rotated video reports the right width/height.
        CGSize naturalSize = [track naturalSize];
        naturalSize = CGSizeApplyAffineTransform(naturalSize, track.preferredTransform);
        NSInteger width = (NSInteger)naturalSize.width;
        NSInteger height = (NSInteger)naturalSize.height;
        NSLog(@"Resolution : %ld x %ld", (long)width, (long)height);
    }
}
However, for HLS video the code above does not work, so I solved that case in a different way: while the video is playing, I grab the current frame from an AVPlayerItemVideoOutput and read its resolution.
Here is the sample code:
// The player is playing; _videoOutput is an AVPlayerItemVideoOutput attached to the item (setup shown below).
if (_player.rate != 0 && _player.error == nil)
{
    CMTime currentTime = _player.currentItem.currentTime;
    CVPixelBufferRef buffer = [_videoOutput copyPixelBufferForItemTime:currentTime itemTimeForDisplay:nil];
    if (buffer != NULL)
    {
        NSInteger width = CVPixelBufferGetWidth(buffer);
        NSInteger height = CVPixelBufferGetHeight(buffer);
        NSLog(@"Resolution : %ld x %ld", (long)width, (long)height);
        // copyPixelBufferForItemTime: returns a retained buffer, so release it.
        CVBufferRelease(buffer);
    }
}
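For completeness, _videoOutput in the snippet above has to be created and attached to the player item beforehand. A minimal sketch; the BGRA pixel format is just one reasonable choice:

// Create a video output that can vend pixel buffers for the playing item.
NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
[_player.currentItem addOutput:_videoOutput];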
As you have mentioned that it is not a local video, you can call some web service that returns the available video dimensions for that particular video. After that, change the URL to the other available rendition and seek to the current position, as sketched below.
Refer to this.
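A minimal sketch of that switch, assuming the web service gave you the URL of another rendition (otherRenditionURL is a placeholder):

// Remember where we are in the current item.
CMTime currentTime = _player.currentItem.currentTime;

// Swap in the item for the other dimension and jump back to the same position.
AVPlayerItem *newItem = [AVPlayerItem playerItemWithURL:otherRenditionURL];
[_player replaceCurrentItemWithPlayerItem:newItem];
[_player seekToTime:currentTime];
[_player play];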
Does anyone know how to display a "This video is playing on ..." screen when AirPlaying with AVPlayer? Here is an example from the VEVO iPhone app:
By default, AVPlayer just displays a black screen. Do I have to implement such a screen myself or is a default component available for this?
Maybe this is a little late, but I figured out a workaround (or at least a partial one) for this. I added a UILabel and obtained the name of the selected AirPlay device by doing something like this:
CFDictionaryRef description;
UInt32 dataSize = sizeof(description);
if (AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &dataSize, &description) == kAudioSessionNoError) {
    CFArrayRef outputs = CFDictionaryGetValue(description, kAudioSession_AudioRouteKey_Outputs);
    if (outputs) {
        if (CFArrayGetCount(outputs) > 0) {
            // Inspect the first output of the current audio route.
            CFDictionaryRef currentOutput = CFArrayGetValueAtIndex(outputs, 0);
            NSLog(@"%@", currentOutput);
            CFStringRef outputType = CFDictionaryGetValue(currentOutput, kAudioSession_AudioRouteKey_Type);
            if (CFStringCompare(outputType, kAudioSessionOutputRoute_AirPlay, 0) == kCFCompareEqualTo) {
                // The route is AirPlay; read the device name from the route description.
                NSDictionary *desc = (__bridge NSDictionary *)(currentOutput);
                NSLog(@"%@", [desc objectForKey:@"RouteDetailedDescription_Name"]);
            }
        }
    }
}
There may be a better way, but this approach works. Note that AudioSessionGetProperty is deprecated; the same information can be obtained through AVAudioSession.
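For example, a minimal sketch of the AVAudioSession equivalent, checking the current route for an AirPlay output and reading its name:

AVAudioSessionRouteDescription *route = [[AVAudioSession sharedInstance] currentRoute];
for (AVAudioSessionPortDescription *output in route.outputs) {
    if ([output.portType isEqualToString:AVAudioSessionPortAirPlay]) {
        // portName is the user-visible name of the AirPlay device.
        NSLog(@"This video is playing on %@", output.portName);
    }
}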
Hope this helps
That image is displayed by default when using MPMoviePlayerController. Since AVPlayer doesn't have a UI, that image is not available unless MPMoviePlayerController is used.
Also, I don't think the image is available as a standalone component outside of MPMoviePlayerController.
I'm developing an iOS app that receives UIImages at random times over an internet connection and progressively constructs a video file from them as they come in. I have it working to a degree, but the fact that the images don't arrive at a constant rate is messing up the video.
How do I recalculate the CMTime each time a new UIImage arrives, so that it accounts for the varying arrival rate of the UIImages, which can arrive anywhere from milliseconds to seconds apart?
Here is what I'm doing so far (some code is not shown, but this is the basic idea):
.
.
adaptor = [AVAssetWriterInputPixelBufferAdaptor
              assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoStream
              sourcePixelBufferAttributes:attributes];
CMTime frameTime = CMTimeMake(1, 10); // assumed initial frame rate
.
.
- (void)addImageToMovie:(UIImage *)img {
    append_ok = FALSE;
    buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:img.size];
    while (!append_ok) {
        if (adaptor.assetWriterInput.readyForMoreMediaData) {
            // Advance the presentation time by one tick of the timescale (0.1 s here).
            frameTime.value += 1;
            append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            [NSThread sleepForTimeInterval:0.01];
        } else {
            [NSThread sleepForTimeInterval:0.01];
        }
    }
    if (buffer) {
        CVBufferRelease(buffer);
    }
}
It depends on how much time you want each frame to cover. frameTime was created with CMTimeMake(1, 10), i.e. a timescale of 10 ticks per second, so adding 1 to frameTime.value advances the presentation time by only 0.1 seconds; adding 10 instead advances it by a full second per image.
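If the images arrive at irregular intervals, a further option (my own assumption, not something the answer above spells out) is to scale the increment by the actual elapsed time between arrivals, converted into ticks of frameTime's timescale. A rough sketch of the bookkeeping that could sit at the top of addImageToMovie:, where lastArrival is a hypothetical instance variable:

// Requires #import <QuartzCore/QuartzCore.h> for CACurrentMediaTime().
- (void)addImageToMovie:(UIImage *)img {
    NSTimeInterval now = CACurrentMediaTime();
    if (lastArrival > 0) {
        // Seconds since the previous image, converted to ticks (10 ticks per second here).
        long long ticks = llround((now - lastArrival) * frameTime.timescale);
        if (ticks < 1) ticks = 1; // never reuse the same presentation time
        frameTime.value += ticks;
    } else {
        frameTime.value += 1; // first image: fall back to a single tick
    }
    lastArrival = now;
    // ... append the pixel buffer at frameTime exactly as in the original method ...
}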