I have been working with AVMutableComposition to mix an audio file with a video.
For the part that inserts a track at video time zero, I am using this:
AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
My challenge now is to let the user pick the video time range where the audio should go. I have no idea how this works with CMTimeMake, or whether there is an existing smooth picker for this.
Thanks for helping!
CMTimeMake(value, timescale)
value - the number of time units
timescale - how many of those units make up one second
The time in seconds is value / timescale, so:
CMTimeMake(1, 30) // 1/30 of a second
CMTimeMake(30, 1) // 30 seconds
For example, CMTimeMake(1, 1) and CMTimeMake(30, 30) describe the same absolute time (one second) with different granularity, which is important when you deal with audio and video file processing.
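Applied to the original question, here is a minimal sketch of inserting an audio file at a user-chosen point on the composition's timeline. mixComposition comes from the question; audioAsset, userStartSeconds and userDurationSeconds are placeholders for the audio file and whatever values your own picker UI produces.
// Placeholders: these would come from your own UI, e.g. a pair of sliders.
Float64 userStartSeconds = 3.5;
Float64 userDurationSeconds = 10.0;
// 600 is a conventional timescale that divides evenly by 24, 25 and 30 fps.
CMTime startTime = CMTimeMakeWithSeconds(userStartSeconds, 600);
CMTime duration = CMTimeMakeWithSeconds(userDurationSeconds, 600);
AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *sourceAudioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
NSError *error = nil;
// Take `duration` seconds from the start of the audio file and place it at `startTime` on the composition's timeline.
[b_compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration) ofTrack:sourceAudioTrack atTime:startTime error:&error];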
I am using scaleTimeRange:toDuration: to produce a fast-motion effect of up to 10x the original video speed. However, I have noticed that videos with a higher bitrate (say 20 Mbit/s and above) begin to stutter when played through an AVPlayer at anything above 4x the normal speed, enough to crash the AVPlayerLayer (it disappears) if playback runs for a while.
The code:
//initialize the player
self.player = [[AVPlayer alloc] init];
//load up the asset
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[[NSBundle mainBundle] URLForResource:@"sample-video" withExtension:@"mov"] options:nil];
[asset loadValuesAsynchronouslyForKeys:@[@"playable", @"hasProtectedContent", @"tracks"] completionHandler:
^{
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// composition set to play at 60fps
videoComposition.frameDuration = CMTimeMake(1,60);
//add video track to composition
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
//1080p composition
CGSize renderSize = CGSizeMake(1920.0, 1080.0);
CMTime currentTime = kCMTimeZero;
CGFloat scale = 1.0;
AVAssetTrack *assetTrack = nil;
assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:currentTime error:nil];
CMTimeRange scaleTimeRange = CMTimeRangeMake(currentTime, asset.duration);
//Speed it up to 8x.
CMTime scaledDuration = CMTimeMultiplyByFloat64(asset.duration,1.0/8.0);
[videoTrack scaleTimeRange:scaleTimeRange toDuration:scaledDuration];
AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
//ensure video is scaled up/down to match the composition
scale = renderSize.width/assetTrack.naturalSize.width;
[layerInstruction setTransform:CGAffineTransformMakeScale(scale, scale) atTime:currentTime];
AVMutableVideoCompositionInstruction *videoInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
videoInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
videoInstruction.layerInstructions = @[layerInstruction];
videoComposition.instructions = @[videoInstruction];
videoComposition.renderSize = renderSize;
//pass the stuff to AVPlayer for playback
self.playerItem = [AVPlayerItem playerItemWithAsset:composition];
self.playerItem.videoComposition = videoComposition;
[self.player replaceCurrentItemWithPlayerItem:self.playerItem];
//playerView is a custom view with AVPlayerLayer, picked up from https://developer.apple.com/library/mac/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html#//apple_ref/doc/uid/TP40010188-CH3-SW11
[self.playerView setPlayer:self.player];
//call [self.player play] when ready.
}];
Some notes:
All testing done on iPhone 6
I am deliberately not adding any audio tracks to rule out the possibility of audio playing a part here.
Normal-bitrate videos (16 Mbit/s on average) play fine at 10x
Same composition code produces a smooth playback on an OSX application
The stutter is more obvious with higher bitrates.
All videos being tested are 1080p 60fps
A high-bitrate video behaves well if it is first re-exported at 1080p, which tones down the bitrate while maintaining the FPS.
There is no rendering/exporting of video involved.
Has anyone else run into this and found a way around it?
We have a video player where we play videos inside an AVPlayer (about 1 GB of content split into .mov files of roughly 8 MB each). We load the AVPlayer using an AVMutableComposition of a video track and an audio track that are on local disk, bundled with the app.
We do something like:
AVAsset* videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
AVAsset* voiceAsset = useVoice ? [[AVURLAsset alloc] initWithURL:voiceUrl options:nil] : nil;
AVMutableComposition* composition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack* videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack* audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack* voiceTrack = useVoice ? [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid] : nil;
NSError* error = nil;
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject] atTime:kCMTimeZero error:&error];
if (error) {
[[MNGAppDelegate sharedManagers].errorManager presentError:error];
}
if ([videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) {
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
if (error) {
[[MNGAppDelegate sharedManagers].errorManager presentError:error];
}
}
if (useVoice) {
[voiceTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, voiceAsset.duration) ofTrack:[[voiceAsset tracksWithMediaType:AVMediaTypeAudio] firstObject] atTime:kCMTimeZero error:&error];
if (error) {
[[MNGAppDelegate sharedManagers].errorManager presentError:error];
}
}
And we load it using replaceCurrentItemWithPlayerItem: (except for the first one).
[self.player replaceCurrentItemWithPlayerItem:nextItem];
We never create a playlist, and the user cannot go back. We simply replace the item when a new video needs to be played.
What we're noticing is that VM Tracker shows our Dirty Size going crazy. Once we play the first 8 MB file, we approach about 80 MB of dirty memory. As we replace more and more videos, we can easily get the Dirty Size to 200 MB+. Within about 20-30 videos the app is usually killed and we get a low-memory crash log.
Is there something special we should be doing to reduce the memory of AVPlayer as we replace clips in the player?
I have found that setting:
[someAssetWriterInput setExpectsMediaDataInRealTime:NO];
...has some effect on the memory pressure experienced during AVComposition-oriented export sessions. It would appear to be at least one way to govern the framework's internal memory usage.
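For context, a minimal sketch of where that flag lives, assuming an AVAssetWriter pipeline you have configured elsewhere (outputSettings here is a placeholder):
AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
// NO tells the framework this is an offline export, so it can schedule and buffer media data less aggressively than it would for real-time capture.
videoWriterInput.expectsMediaDataInRealTime = NO;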
self.player?.pause()
self.activityIndicator?.startAnimating()
self.removeObserverPlayerItem()
let item = AVPlayerItem(url: fileurl)
player?.replaceCurrentItem(with: item)
self.addObserverPlayerItem()
self.player?.play()
This keeps memory under control and only uses what is needed; it resolved my problem.
I've created an AVMutableComposition that consists of a bunch of audio tracks that start at specific times. From there, following Apple's recommendations, I turned it into an AVComposition before playing it with AVPlayer.
Playing this AVPlayer item works fine, but if I pause it and then continue, all the tracks in the composition appear to slip back about 0.2 seconds relative to each other (i.e., they bunch up). Hitting pause and continuing several times compounds the effect and the overlap becomes more significant (basically, if I do it enough, I end up with all 8 tracks playing simultaneously).
if (self.player.rate > 0.0) {
//if player is playing, pause
[self.player pause];
} else {
if (self.player) {
[self.player play];
return;
}
/* CODE CREATING COMPOSITION - a big chunk of code relating to finding the track and retrieving its position and scale has been omitted */
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:url options:options];
//calculate times
NSNumber *time = [soundArray1 objectAtIndex:1]; //this is the time scale - e.g. 96 or 120 etc.
double timenow = [time doubleValue];
double insertTime = (240*y);
AVMutableCompositionTrack *track =
[composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
//insert the audio track from the asset into the track added to the mutable composition
AVAssetTrack *myTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTimeRange myTrackRange = myTrack.timeRange;
NSError *error = nil;
[track insertTimeRange:myTrackRange
ofTrack:myTrack
atTime:CMTimeMake(insertTime, timenow)
error:&error];
[sourceAsset release];
}
}
AVComposition *immutableSnapshotOfMyComposition = [composition copy];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:immutableSnapshotOfMyComposition];
self.player = [[AVPlayer alloc] initWithPlayerItem:playerItem];
NSLog(@"here");
[self.player play];
Thanks
OK, this feels a little hacky, but it definitely works if anybody is stuck. If someone has a better answer, do let me know!
Basically, I just save the player.currentTime of the track when I hit pause and rebuild the track when I hit play, starting from the point at which I paused. There is no discernible delay, but I'd still be happier without wasting the extra processing.
Make sure you properly release your player item after you hit pause, otherwise you'll end up with a giant stack of AVPlayers!
I have a solution that is a bit less hacky but still hacky.
The solution comes from noticing that if you seek on the player, the latency between audio and video introduced by pausing disappears.
Hence: save player.currentTime just before pausing, and call seekToTime: on the player just before playing again. It works pretty well on iOS 6; I haven't tested on other versions yet.
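A minimal sketch of that workaround; resumeTime is a hypothetical CMTime property used only to carry the value between pause and play, and the zero tolerances are an assumption to make the seek frame-accurate:
// When pausing: remember where playback was (resumeTime is a hypothetical property).
self.resumeTime = self.player.currentTime;
[self.player pause];
// When resuming: seek back to the saved time so the tracks line up again, then play.
[self.player seekToTime:self.resumeTime toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero completionHandler:^(BOOL finished) {
    [self.player play];
}];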
This question is quite related to AVMutableComposition - Blank/Black frame between videos assets, but as I am not using an AVAssetExportSession, the answers don't fit my problem.
I'm using an AVMutableComposition to create a video composition and I'm reading it using an AVAssetReader (I need the frame data, so I can't use an AVPlayer), but I often get black frames between my video chunks (there is no noticeable glitch in the audio).
I create my composition like this:
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
NSMutableArray* durationList = [NSMutableArray array];
NSMutableArray* videoList= [NSMutableArray array];
NSMutableArray* audioList= [NSMutableArray array];
for (NSInteger i = 0; i < [clips count]; i++)
{
AVURLAsset *myasset = [clips objectAtIndex:i];
AVAssetTrack *clipVideoTrack = [[myasset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
[videoList addObject:clipVideoTrack];
AVAssetTrack *clipAudioTrack = [[myasset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
[audioList addObject:clipAudioTrack];
CMTime clipDuration = [myasset duration];
CMTimeRange clipRange = CMTimeRangeMake(kCMTimeZero, clipDuration);
[durationList addObject:[NSValue valueWithCMTimeRange:clipRange]];
}
[compositionVideoTrack insertTimeRanges:durationList ofTracks:videoList atTime:kCMTimeZero error:nil];
[compositionAudioTrack insertTimeRanges:durationList ofTracks:audioList atTime:kCMTimeZero error:nil];
I also tried inserting each track manually into my composition, but I see the same phenomenon.
Thanks
I managed to solve this problem after many hours of testing. There are two things I needed to change.
1) add the 'precise' option when creating the asset.
NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
AVURLAsset *videoAsset3 = [AVURLAsset URLAssetWithURL:clip3URL options:options];
2) don't use
CMTime duration3 = [videoAsset3 duration];
use instead
AVAssetTrack *videoAsset3Track = [[videoAsset3 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CMTime duration3 = videoAsset3Track.timeRange.duration;
I found this out after I set the AVPlayer's background color to blue and then noticed blue frames appearing, so the problem had to do with timings. Once I made the changes above, the different videos lined up fine when using:
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration3)
ofTrack:videoAsset3Track
atTime:kCMTimeZero error:&error];
Try creating two video tracks, and then alternate between the two when adding the clips.
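A sketch of that idea, using the composition and clips array from the question above; consecutive clips are appended end to end on the timeline but alternate between two composition video tracks:
AVMutableCompositionTrack *videoTrackA = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *videoTrackB = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime insertTime = kCMTimeZero;
for (NSInteger i = 0; i < [clips count]; i++)
{
    AVURLAsset *myasset = [clips objectAtIndex:i];
    AVAssetTrack *clipVideoTrack = [[myasset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // Even-numbered clips go on track A, odd-numbered clips on track B.
    AVMutableCompositionTrack *targetTrack = (i % 2 == 0) ? videoTrackA : videoTrackB;
    NSError *error = nil;
    [targetTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, clipVideoTrack.timeRange.duration) ofTrack:clipVideoTrack atTime:insertTime error:&error];
    insertTime = CMTimeAdd(insertTime, clipVideoTrack.timeRange.duration);
}
The audio can stay on a single composition track exactly as in the question.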
Calculating the last CMTime of the existing composition is very important before calling insertTimeRange: for a new AVAsset.
I had the same problem and solved it with:
func addChunk(media: AVAsset) throws {
let duration = self.mutableComposition.tracks.last?.timeRange.end
try mutableComposition.insertTimeRange(CMTimeRange(start: .zero, duration: media.duration), of: media, at: duration ?? .zero)
}
I'm currently trying to put 5 videos back to back using AVMutableComposition like so:
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset1.duration) ofAsset:asset1 atTime:[mixComposition duration] error:nil];
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset2.duration) ofAsset:asset2 atTime:[mixComposition duration] error:nil];
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset3.duration) ofAsset:asset3 atTime:[mixComposition duration] error:nil];
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset4.duration) ofAsset:asset4 atTime:[mixComposition duration] error:nil];
[mixComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset5.duration) ofAsset:asset5 atTime:[mixComposition duration] error:nil];
I then use an AVAssetExportSession to export the video, which works; however, between each video I'm getting a blank/black frame which I need to remove. Has anyone had this problem before, and if so, did you manage to fix it?
Also, the blank frames aren't in the source video files.
Thanks in advance.
I had the same problem the other day.
If you obtained your assets (asset1, asset2, etc.) by exporting them from another asset that was itself built using insertTimeRange:, then it's the same case.
The problem is that when you export assets created using insertTimeRange:, the export does not come out frame-accurate, and when you concatenate such videos, black frames appear between them.
Try using the timeRange property of AVAssetExportSession and cut the range you need from the main asset. Then your assets will join up correctly.
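A sketch of that approach; mainAsset, outputURL and the time values are placeholders for your own source asset, destination and cut points:
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mainAsset presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
// Export only the slice you need from the original asset instead of stitching together pre-exported pieces.
CMTime start = CMTimeMakeWithSeconds(10.0, 600);
CMTime duration = CMTimeMakeWithSeconds(5.0, 600);
exportSession.timeRange = CMTimeRangeMake(start, duration);
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"export finished");
    }
}];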
I just ran into the same problem. It turns out the solution is to use AVMutableCompositionTrack and composite the video tracks, something like this:
AVMutableCompositionTrack * videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:composition.duration error:&error];
For the audio part, I think you must add a separate track of type AVMediaTypeAudio.
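A sketch of that audio side, following the same pattern as the video insert above; note that the append point should be captured before either insert so the clip's audio and video start at the same time:
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
// Capture the append point once, before inserting video or audio for this clip.
CMTime appendTime = composition.duration;
AVAssetTrack *sourceAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
if (sourceAudioTrack) {
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:sourceAudioTrack atTime:appendTime error:&error];
}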