The Problem:
During every playback, the audio is 1-2 seconds behind the video.
The Setup:
The assets are loaded with AVURLAssets from a media stream.
To build the composition, I'm using an AVMutableComposition and AVMutableCompositionTracks with asymmetric timescales. The audio and video are both streamed to the device. The timescale for the audio is 44100; the timescale for the video is 600.
The playback is done with AVPlayer.
Attempted Solutions:
Using videoAssetTrack.timeRange for [composition insertTimeRange].
Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.duration);
Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration);
The Code:
+(AVMutableComposition*)overlayAudio:(AVURLAsset*)audioAsset
                           withVideo:(AVURLAsset*)videoAsset
{
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    AVAssetTrack* audioTrack = [self getTrackFromAsset:audioAsset withMediaType:AVMediaTypeAudio];
    AVAssetTrack* videoTrack = [self getTrackFromAsset:videoAsset withMediaType:AVMediaTypeVideo];

    CMTime duration = videoTrack.timeRange.duration;

    AVMutableCompositionTrack* audioComposition = [self composeTrack:audioTrack withComposition:mixComposition andDuration:duration andMedia:AVMediaTypeAudio];
    AVMutableCompositionTrack* videoComposition = [self composeTrack:videoTrack withComposition:mixComposition andDuration:duration andMedia:AVMediaTypeVideo];

    [self makeAssertionAgainstAudio:audioComposition andVideo:videoComposition];

    return mixComposition;
}
+(AVAssetTrack*)getTrackFromAsset:(AVURLAsset*)asset withMediaType:(NSString*)mediaType
{
    return [[asset tracksWithMediaType:mediaType] objectAtIndex:0];
}
+(AVAssetExportSession*)configureExportSessionWithAsset:(AVMutableComposition*)composition toUrl:(NSURL*)url
{
    AVAssetExportSession* exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputFileType = @"com.apple.quicktime-movie";
    exportSession.outputURL = url;
    exportSession.shouldOptimizeForNetworkUse = YES;
    return exportSession;
}
-(IBAction)playVideo
{
    [avPlayer pause];

    avPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
    avPlayer = [[AVPlayer alloc] initWithPlayerItem:avPlayerItem];

    avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayer];
    [avPlayerLayer setFrame:CGRectMake(0, 0, 305, 283)];
    [avPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [playerView.layer addSublayer:avPlayerLayer];

    [avPlayer seekToTime:kCMTimeZero];
    [avPlayer play];
}
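The composeTrack:withComposition:andDuration:andMedia: and makeAssertionAgainstAudio:andVideo: helpers are not shown. For context, here is a minimal sketch of what the compose helper presumably does; this is an assumption for illustration, not the author's actual code:

// Hypothetical reconstruction of the helper referenced above.
+(AVMutableCompositionTrack*)composeTrack:(AVAssetTrack*)assetTrack
                          withComposition:(AVMutableComposition*)composition
                              andDuration:(CMTime)duration
                                 andMedia:(NSString*)mediaType
{
    AVMutableCompositionTrack* compositionTrack =
        [composition addMutableTrackWithMediaType:mediaType preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError* error = nil;
    [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration)
                              ofTrack:assetTrack
                               atTime:kCMTimeZero
                                error:&error];
    if (error) {
        NSLog(@"Failed to insert %@ track: %@", mediaType, error);
    }
    return compositionTrack;
}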
Comments:
I don't understand much of the AVFoundation framework, so it is entirely possible that I am simply misusing the snippets I have provided (e.g., why insertTimeRange: for the composition?).
I can provide any other information needed for resolution - including debug asset track property values, network telemetry, streaming information, etc.
If the offset is consistent, it is most likely the enforced encoder delay that is needed to sample the audio properly.
Apple's programming guides are usually easier to read than the accompanying reference material; that said, here is the specific technical note on the delay:
https://developer.apple.com/library/ios/technotes/tn2258/_index.html
The programming guides explain the why and the what in more detail.
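As a starting point for diagnosing this, here is a minimal sketch (not from the original post; the composition track names are assumptions) that compares the source tracks' timeRanges and inserts each track using its own range, so any leading audio offset is preserved rather than forced to the video's duration:

// Compare the source tracks' timeRanges; a non-zero audio start or a duration
// mismatch can point at encoder delay/priming rather than a composition bug.
AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSLog(@"audio start %f, duration %f / video start %f, duration %f",
      CMTimeGetSeconds(audioTrack.timeRange.start), CMTimeGetSeconds(audioTrack.timeRange.duration),
      CMTimeGetSeconds(videoTrack.timeRange.start), CMTimeGetSeconds(videoTrack.timeRange.duration));

AVMutableComposition *mix = [AVMutableComposition composition];
AVMutableCompositionTrack *audioCompositionTrack =
    [mix addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *videoCompositionTrack =
    [mix addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

// Insert each track with its own timeRange instead of applying the video's
// duration to both, which can shift the audio if the ranges differ.
NSError *error = nil;
[audioCompositionTrack insertTimeRange:audioTrack.timeRange ofTrack:audioTrack atTime:kCMTimeZero error:&error];
[videoCompositionTrack insertTimeRange:videoTrack.timeRange ofTrack:videoTrack atTime:kCMTimeZero error:&error];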
Related
I'm trying to create an AVMutableComposition and play it using AVPlayer. This is what I'm doing:
//Define the AVComposition
self.composition = [[AVMutableComposition alloc] init];
//Define Mutable Track
self.mainVideoTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
//Define the asset
AVAsset *asset = [AVAsset assetWithURL:url];
//Insert the asset in the track
[self.mainVideoTrack insertTimeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:[self startingTimeForSegment:videoSegment] error:&error];
//Define player item with the composition
self.playerItem = [[AVPlayerItem alloc] initWithAsset:self.composition];
//Define the player, player layer & add the layer
self.avPlayer = [[AVPlayer alloc]initWithPlayerItem:self.playerItem];
self.layerVideo = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
self.layerVideo.frame = self.previewContainer.bounds;
[self.previewContainer.layer addSublayer:self.layerVideo];
However, I can hear no sound in the player layer. If I initialize the player item directly with the asset, without using AVMutableComposition, the sound plays. What am I doing wrong? I'm stuck on this.
It looks like you are only adding one track to your AVMutableComposition. You would need to add the Audio track from the asset as well. This can be done exactly the same way as adding a video track.
self.mainAudioTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
And then you can insert from your asset's audio track. This is assuming you want your audio to start at the same point your video does.
[self.mainAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:[self startingTimeForSegment:audio] error:&error];
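Putting the two together, a minimal sketch of the corrected setup (assuming the audio and video should start at the same segment time, as in the question):

self.composition = [[AVMutableComposition alloc] init];
self.mainVideoTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
self.mainAudioTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

AVAsset *asset = [AVAsset assetWithURL:url];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
CMTime startTime = [self startingTimeForSegment:videoSegment];
NSError *error = nil;

// Insert the video and audio tracks over the same range at the same start time.
[self.mainVideoTrack insertTimeRange:fullRange
                             ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                              atTime:startTime error:&error];
[self.mainAudioTrack insertTimeRange:fullRange
                             ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                              atTime:startTime error:&error];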
I am trying to play a video which is stored on my device using the following code. The problem is that I am seeing nothing on the screen, just a white screen.
The value of filePath in this case is file:///var/mobile/Containers/Data/Application/0E87EF09-B87C-443F-9BFB-E9A68AC4A162/Documents/TMRecordedVideos/2015-07-25T18-57-19Z.mov
NSString *filePath = [self.video fileURL];
NSURL *fileUrl = [NSURL fileURLWithPath:filePath isDirectory:NO];
AVAsset *videoAsset = [AVAsset assetWithURL:fileUrl];
AVPlayerItem *playerItem = [[AVPlayerItem alloc] initWithAsset:videoAsset];
self.videoPlayer = [AVPlayer playerWithPlayerItem:playerItem];
self.videoPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:self.videoPlayer];
[self.videoPlayerLayer setFrame:self.view.frame];
[self.view.layer addSublayer:self.videoPlayerLayer];
[self.videoPlayer play];
I am using AVPlayerLayer to play video. I used the code below and it worked fine. The URL (outputFileURL in my case) is similar to your filePath.
AVPlayer *avPlayerq = [AVPlayer playerWithURL:outputFileURL];
avPlayerq.actionAtItemEnd = AVPlayerActionAtItemEndNone;
AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayerq];
videoLayer.frame= self.view.bounds;
videoLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:videoLayer];
[avPlayerq play];
Hope this helps you sort out your issue.
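If the white screen persists, one additional diagnostic (not part of the original answer, just a hedged sketch) is to observe the player item's status so a failed asset load is reported instead of silently showing nothing:

// In the setup code:
AVPlayerItem *item = [AVPlayerItem playerItemWithURL:outputFileURL];
[item addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

// In the observing class:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context
{
    AVPlayerItem *observedItem = (AVPlayerItem *)object;
    if (observedItem.status == AVPlayerItemStatusFailed) {
        NSLog(@"Playback failed: %@", observedItem.error); // e.g. a bad file path
    } else if (observedItem.status == AVPlayerItemStatusReadyToPlay) {
        NSLog(@"Item is ready to play");
    }
}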
I am using scaleTimeRange:toDuration: to produce a fast-motion effect of up to 10x the original video speed. However, I have noticed that videos with a higher bitrate (say 20 Mbit/s and above) begin to stutter when played through an AVPlayer at anything above 4x normal speed, enough to make the AVPlayerLayer disappear (crash) if it runs for a while.
The code:
//initialize the player
self.player = [[AVPlayer alloc] init];
//load up the asset
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[[NSBundle mainBundle] URLForResource:@"sample-video" withExtension:@"mov"] options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"playable", @"hasProtectedContent", @"tracks"] completionHandler:
^{
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// composition set to play at 60fps
videoComposition.frameDuration = CMTimeMake(1,60);
//add video track to composition
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
//1080p composition
CGSize renderSize = CGSizeMake(1920.0, 1080.0);
CMTime currentTime = kCMTimeZero;
CGFloat scale = 1.0;
AVAssetTrack *assetTrack = nil;
assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:currentTime error:nil];
CMTimeRange scaleTimeRange = CMTimeRangeMake(currentTime, asset.duration);
//Speed it up to 8x.
CMTime scaledDuration = CMTimeMultiplyByFloat64(asset.duration,1.0/8.0);
[videoTrack scaleTimeRange:scaleTimeRange toDuration:scaledDuration];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
//ensure video is scaled up/down to match the composition
scale = renderSize.width/assetTrack.naturalSize.width;
[layerInstruction setTransform:CGAffineTransformMakeScale(scale, scale) atTime:currentTime];
AVMutableVideoCompositionInstruction *videoInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
videoInstruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[videoInstruction];
videoComposition.renderSize = renderSize;
//pass the stuff to AVPlayer for playback
self.playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.playerItem.videoComposition = videoComposition;
[self.player replaceCurrentItemWithPlayerItem:self.playerItem];
//playerView is a custom view with AVPlayerLayer, picked up from https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html#//apple_ref/doc/uid/TP40010188-CH3-SW11
[self.playerView setPlayer:self.player];
//call [self.player play] when ready.
}];
Some notes:
All testing done on iPhone 6
I am deliberately not adding any audio tracks to rule out the possibility of audio playing a part here.
Normal-bitrate videos (16 Mbit/s on average) play fine at 10x
Same composition code produces a smooth playback on an OSX application
The stutter is more obvious with higher bitrates.
All videos being tested are 1080p 60fps
A high-bitrate video behaves well if it is first re-exported at 1080p to tone down the bitrate while maintaining FPS (see the sketch after this list).
There is no rendering/exporting of video involved.
Has anyone else run into this and found a way around it?
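Not an answer from the thread, but a hedged sketch of the pre-export workaround described in the notes above; the preset name and output path are assumptions:

// Re-export the source asset at 1080p to lower the bitrate before building
// the fast-motion composition.
AVAssetExportSession *preExport =
    [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset1920x1080];
preExport.outputFileType = AVFileTypeQuickTimeMovie;
preExport.outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"lower-bitrate.mov"]];
[preExport exportAsynchronouslyWithCompletionHandler:^{
    if (preExport.status == AVAssetExportSessionStatusCompleted) {
        // Build the scaled composition from the re-exported asset instead of the original.
        AVURLAsset *lowerBitrateAsset = [AVURLAsset URLAssetWithURL:preExport.outputURL options:nil];
        // ... proceed with the composition code above using lowerBitrateAsset
    }
}];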
I am implementing an HTTP Live Streaming player on OS X using AVPlayer.
I am able to stream properly, seek, get the duration, and so on.
Now I want to take screenshots and process the frames with OpenCV.
I went with AVAssetImageGenerator, but there are no audio or video tracks on the AVAsset associated with player.currentItem.
The tracks do appear in player.currentItem.tracks.
So I am not able to use AVAssetImageGenerator. Can anybody help me find a way to extract screenshots and individual frames in this scenario?
Here is how I am initiating the HTTP live stream:
Thanks in advance.
NSURL* url = [NSURL URLWithString:@"http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8"];
playeritem = [AVPlayerItem playerItemWithURL:url];
[playeritem addObserver:self forKeyPath:@"status" options:0 context:AVSPPlayerStatusContext];
[self setPlayer:[AVPlayer playerWithPlayerItem:playeritem]];
[self addObserver:self forKeyPath:@"player.rate" options:NSKeyValueObservingOptionNew context:AVSPPlayerRateContext];
[self addObserver:self forKeyPath:@"player.currentItem.status" options:NSKeyValueObservingOptionNew context:AVSPPlayerItemStatusContext];
AVPlayerLayer *newPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:[self player]];
[newPlayerLayer setFrame:[[[self playerView] layer] bounds]];
[newPlayerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[newPlayerLayer setHidden:YES];
[[[self playerView] layer] addSublayer:newPlayerLayer];
[self setPlayerLayer:newPlayerLayer];
[self addObserver:self forKeyPath:@"playerLayer.readyForDisplay" options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew context:AVSPPlayerLayerReadyForDisplay];
[self.player play];
The following is how I am checking whether a video track is present on the asset:
case AVPlayerItemStatusReadyToPlay:
[self setTimeObserverToken:[[self player] addPeriodicTimeObserverForInterval:CMTimeMake(1, 10) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) {
[[self timeSlider] setDoubleValue:CMTimeGetSeconds(time)];
NSLog(@"%f,%f,%f",[self currentTime],[self duration],[[self player] rate]);
AVPlayerItem *item = playeritem;
if(item.status == AVPlayerItemStatusReadyToPlay)
{
AVAsset *asset = (AVAsset *)item.asset;
long audiotracks = [[asset tracks] count];
long videotracks = [[asset availableMediaCharacteristicsWithMediaSelectionOptions]count];
NSLog(@"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
}
}]];
AVPlayerItem *item = self.player.currentItem;
if(item.status != AVPlayerItemStatusReadyToPlay)
return;
AVURLAsset *asset = (AVURLAsset *)item.asset;
long audiotracks = [[asset tracksWithMediaType:AVMediaTypeAudio]count];
long videotracks = [[asset tracksWithMediaType:AVMediaTypeVideo]count];
NSLog(@"Track info Audio = %ld,Video=%ld",audiotracks,videotracks);
This is an older question, but in case someone still needs help with it, I have an answer.
AVURLAsset *asset = /* Your Asset here! */;
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * /* Put the FPS of the source video here */ ; i++){
#autoreleasepool {
CMTime time = CMTimeMake(i, /* Put the FPS of the source video here */);
NSError *err;
CMTime actualTime;
CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
// Do what you want with the image, for example save it as UIImage
UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
}
}
You can easily get the FPS of a video with this code:
float fps=0.00;
if (asset) {
AVAssetTrack * videoATrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
if(videoATrack)
{
fps = [videoATrack nominalFrameRate];
}
}
Hope that helps anyone asking how to get all frames from a video, or just specific frames (via CMTime, for example). Please bear in mind that saving all frames to an array can have a severe memory impact!
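If memory is a concern, a hedged alternative (not part of the answer above) is to batch the requests with generateCGImagesAsynchronouslyForTimes:, so each frame is delivered to a handler instead of being accumulated:

// Request every frame asynchronously; the handler receives them one at a time,
// so frames can be processed or written to disk without holding them all in memory.
NSMutableArray *times = [NSMutableArray array];
float fps = [[[asset tracksWithMediaType:AVMediaTypeVideo] firstObject] nominalFrameRate];
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * fps; i++) {
    [times addObject:[NSValue valueWithCMTime:CMTimeMake((int64_t)i, (int32_t)fps)]];
}
[generator generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded && image != NULL) {
        UIImage *frame = [[UIImage alloc] initWithCGImage:image];
        // process or save the frame here
    }
}];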
I am playing live streaming video on iPhone using AVPlayer. I want to switch off the player's volume programmatically. I tried this: http://developer.apple.com/library/ios/#qa/qa1716/_index.html, but it is not working in my case.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[self myAssetURL] options:nil];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
// Mute all the audio tracks
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVAssetTrack *track in audioTracks) {
AVMutableAudioMixInputParameters *audioInputParams =[AVMutableAudioMixInputParameters audioMixInputParameters];
[audioInputParams setVolume:0.0 atTime:kCMTimeZero];
[audioInputParams setTrackID:[track trackID]];
[allAudioParams addObject:audioInputParams];
}
AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];
// Create a player item
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
[playerItem setAudioMix:audioZeroMix]; // Mute the player item
// Create a new Player, and set the player to use the player item
// with the muted audio mix
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
// assign player object to an instance variable
self.mPlayer = player;
// play the muted audio
[mPlayer play];
Please suggest any solution for this.
Thanks
You can use the MPVolumeView class (http://developer.apple.com/library/ios/#documentation/mediaplayer/reference/MPVolumeView_Class/Reference/Reference.html). It has a slider for volume control that works for HTTP Live Streams as well.
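A minimal sketch of how that could be wired up (frame values are arbitrary; note this controls the system output volume rather than muting a single player):

#import <MediaPlayer/MediaPlayer.h>

// Add a system volume slider to the view; dragging it to zero silences
// the HTTP Live Stream along with everything else the app plays.
MPVolumeView *volumeView = [[MPVolumeView alloc] initWithFrame:CGRectMake(20, 20, 200, 40)];
[self.view addSubview:volumeView];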