AVAssetExportSession - The video could not be composed - ios

I am trying to do some basic video compositions in Xamarin / MonoTouch and am having some success, but am stuck on what seems to be a rather simple task.
I record videos from the camera in portrait, so I use AVAssetExportSession to rotate the videos. I have created a layer instruction to rotate the video, which works fine: I am able to successfully export the video in the correct orientation.
The Issue:
When I add the audio track into the export I always get a failed response with this error:
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just with the wrong orientation. If anyone could give me some advice it would be greatly appreciated. Below is my code:
var composition = new AVMutableComposition();
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
var index = 0;
var renderSize = new SizeF(480, 480);
var _startTime = CMTime.Zero;
//AVUrlAsset asset;
var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
//var asset = AVAsset.FromUrl(new NSUrl(file, false));
//create an avassetrack with our asset
var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];
//create a video composition and preset some settings
NSError error;
var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };
compositionTrackAudio.InsertTimeRange(new CMTimeRange
{
Start = CMTime.Zero,
Duration = asset.Duration,
}, audioTrack, _startTime, out error);
if (error != null) {
Debug.WriteLine (error.Description);
}
compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);
//create a video instruction
var transformer = new AVMutableVideoCompositionLayerInstruction
{
TrackID = videoTrack.TrackID,
};
var audioMix = new AVMutableAudioMix ();
var mixParameters = new AVMutableAudioMixInputParameters{
TrackID = audioTrack.TrackID
};
mixParameters.SetVolumeRamp (1.0f, 1.0f, new CMTimeRange {
Start = CMTime.Zero,
Duration = asset.Duration
});
audioMix.InputParameters = new [] { mixParameters };
var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
//Make sure the square is portrait
var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
var finalTransform = t2;
transformer.SetTransform(finalTransform, CMTime.Zero);
//add the transformer layer instructions, then add to video composition
var instruction = new AVMutableVideoCompositionInstruction
{
TimeRange = assetTimeRange,
LayerInstructions = new []{ transformer }
};
videoCompositionInstructions[index] = instruction;
index++;
_startTime = CMTime.Add(_startTime, asset.Duration);
var videoComposition = new AVMutableVideoComposition();
videoComposition.FrameDuration = new CMTime(1 , (int)videoTrack.NominalFrameRate);
videoComposition.RenderScale = 1;
videoComposition.Instructions = videoCompositionInstructions;
videoComposition.RenderSize = renderSize;
var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);
var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";
var outputLocation = new NSUrl(filePath, false);
exportSession.OutputUrl = outputLocation;
exportSession.OutputFileType = AVFileType.Mpeg4;
exportSession.VideoComposition = videoComposition;
exportSession.AudioMix = audioMix;
exportSession.ShouldOptimizeForNetworkUse = true;
exportSession.ExportAsynchronously(() =>
{
Debug.WriteLine(exportSession.Status);
switch (exportSession.Status)
{
case AVAssetExportSessionStatus.Failed:
{
Debug.WriteLine(exportSession.Error.Description);
Debug.WriteLine(exportSession.Error.DebugDescription);
break;
}
case AVAssetExportSessionStatus.Completed:
{
if (File.Exists(filePath))
{
_uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
Task.Run(async () =>
{
await _uploadService.UploadVideo(_videoData);
});
}
break;
}
case AVAssetExportSessionStatus.Unknown:
{
break;
}
case AVAssetExportSessionStatus.Exporting:
{
break;
}
case AVAssetExportSessionStatus.Cancelled:
{
break;
}
}
});

So this was a really stupid mistake: it was due to adding the audio track in before the video track, so the instructions must have been trying to apply the transform to the audio track rather than my video track.
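In other words, swapping the two AddMutableTrack calls so the video track is created before the audio track resolves the error. A minimal sketch of the corrected ordering (same APIs as the code above; everything else is unchanged):

```csharp
var composition = new AVMutableComposition();
// Create the video track first so the layer instruction's transform
// is applied to the video track, not the audio track.
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
```

It may also be worth double-checking that transformer.TrackID refers to the composition's video track (compositionTrackVideo.TrackID) rather than the source asset's, since the video composition is evaluated against the composition being exported.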

My problem was that I forgot to set the timeRange. It should be like this:
let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layer]
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)
Note that AVMutableVideoCompositionInstruction.timeRange's end time must be valid. It is different from AVAssetExportSession.timeRange, whose documentation says:
The time range to be exported from the source.
The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possible limit on file length) the full duration of the asset will be exported.
You can observe this property using Key-value observing.
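In C#, the equivalent of the Swift snippet above would be roughly the following (layer and videoDuration are the same names used in that snippet):

```csharp
var instruction = new AVMutableVideoCompositionInstruction
{
    LayerInstructions = new[] { layer },
    // The time range must be set and have a valid end time.
    TimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = videoDuration }
};
```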

Related

Convert video from mov to MP4 in iOS xamarin

How do I convert MOV to MP4 in Xamarin.iOS? I am using AVAssetExportSession, but the last few seconds of the video get clipped off. Is there any solution to retain the full video duration?
Here is my code.
var basePath = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
var inputFilePath = Path.Combine(basePath, Path.ChangeExtension(_vm.Model.AnswerName, "mov"));
var outputFilePath = Path.Combine(basePath, _vm.Model.AnswerName);
var asset = AVAsset.FromUrl(NSUrl.FromFilename(inputFilePath));
AVAssetExportSession export = new AVAssetExportSession(asset, AVAssetExportSession.PresetHighestQuality);
export.OutputUrl = NSUrl.FromFilename(outputFilePath);
export.OutputFileType = AVFileType.Mpeg4;
export.ShouldOptimizeForNetworkUse = true;
try
{
export.ExportTaskAsync().Wait();
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine(ex);
}
var fileHelper = new FileHelper();
// If Export successful then delete the MOV file
if (fileHelper.FileExists(_vm.Model.AnswerName))
{
fileHelper.DeleteFile(Path.ChangeExtension(_vm.Model.AnswerName, "mov"));
}

How to change CreationDate Resource Value

I'm writing a recording app that enables the user to trim parts of previous recordings and concatenate them into one new recording.
My problem is: let's say I recorded an hour-long track and I want to trim the first 2 minutes of it. When I export those 2 minutes, the creation date of the new track will be "now", but I need it to match the date those 2 minutes actually took place.
So basically I'm trying to modify the track's URL resource values, but I want to change only the creation date.
Is there a way to do this? Or is there a way to add a new resource value key, or a way to attach the needed date to the URL?
func trimStatringPoint(_ from: Date, startOffSet: TimeInterval, duration: TimeInterval, fileName: String, file: URL, completion: fileExportaionBlock?) {
if let asset = AVURLAsset(url: file) as AVAsset? {
var trimmedFileUrl = documentsDirectory().appendingPathComponent(fileName)
let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A)
exporter?.outputFileType = AVFileTypeAppleM4A
exporter?.outputURL = trimmedFileUrl
let start = CMTimeMake(Int64(startOffSet), 1)
let end = CMTimeMake(Int64(startOffSet + duration), 1)
exporter?.timeRange = CMTimeRangeFromTimeToTime(start, end)
exporter?.exportAsynchronously { handler in
if exporter?.status != AVAssetExportSessionStatus.completed {
print("Error while exporting \(exporter?.error?.localizedDescription ?? "unknown")")
completion?(nil)
return
}
}
//------------------------------------------------------
// this code needs to be replaced
do {
var resourceValus = URLResourceValues()
resourceValus.creationDate = from
try trimmedFileUrl.setResourceValues(resourceValus)
} catch {
deleteFile(atPath: trimmedFileUrl)
print("Error while setting date - \(error.localizedDescription)")
completion?(nil)
return
}
//------------------------------------------------------
completion?(trimmedFileUrl)
}
Have you tried modifying the metadata of the exported recording?
https://developer.apple.com/documentation/avfoundation/avmetadatacommonkeycreationdate
AVMutableMetadataItem *item = [AVMutableMetadataItem metadataItem];
item.key = AVMetadataCommonKeyCreationDate;
item.keySpace = AVMetadataKeySpaceCommon;
item.value = [NSDate date];
NSArray *metadata = @[ item ];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
exportSession.metadata = metadata;

Xamarin.iOS - ExportAsynchronously fails

Firstly I should mention I'm new at this (iOS and Xamarin).
I'm trying to watermark a video using a image and a text.
I've ported most of the code from the link: https://stackoverflow.com/a/22016800/5275669.
The entire code is pasted here:
try {
var videoAsset = AVUrlAsset.FromUrl (new NSUrl (filepath, false)) as AVUrlAsset;
AVMutableComposition mixComposition = AVMutableComposition.Create ();
var compositionVideoTracks = mixComposition.AddMutableTrack (AVMediaType.Video, 0);
AVAssetTrack clipVideoTrack = videoAsset.TracksWithMediaType (AVMediaType.Video) [0];
var compositionAudioTrack = mixComposition.AddMutableTrack (AVMediaType.Audio, 0);
AVAssetTrack clipAudioTrack = videoAsset.TracksWithMediaType (AVMediaType.Audio) [0];
NSError error;
CMTimeRange timeRangeInAsset = new CMTimeRange ();
timeRangeInAsset.Start = CMTime.Zero;
timeRangeInAsset.Duration = videoAsset.Duration;
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
CGSize sizeOfVideo = videoAsset.NaturalSize;
CATextLayer textOfvideo = (CATextLayer)CATextLayer.Create ();
textOfvideo.String = String.Format ("{0} {1}", DateTime.Now.ToLongTimeString (), "Test app");
textOfvideo.SetFont (CGFont.CreateWithFontName ("Helvetica"));
textOfvideo.FontSize = 50;
textOfvideo.AlignmentMode = CATextLayer.AlignmentCenter;
textOfvideo.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height / 6);
textOfvideo.ForegroundColor = new CGColor (255, 0, 0);
UIImage myImage = UIImage.FromFile ("Icon-Small.png");
CALayer layerCa = CALayer.Create ();
layerCa.Contents = myImage.CGImage;
layerCa.Frame = new CGRect (0, 0, 100, 100);
layerCa.Opacity = 0.65F;
CALayer optionalLayer = CALayer.Create ();
optionalLayer.AddSublayer (textOfvideo);
optionalLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
optionalLayer.MasksToBounds = true;
CALayer parentLayer = CALayer.Create ();
CALayer videoLayer = CALayer.Create ();
parentLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
videoLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
parentLayer.AddSublayer (videoLayer);
parentLayer.AddSublayer (layerCa);
parentLayer.AddSublayer (textOfvideo);
AVMutableVideoComposition videoComposition = AVMutableVideoComposition.Create ();
videoComposition.RenderSize = sizeOfVideo;
videoComposition.FrameDuration = new CMTime (1, 30);
videoComposition.AnimationTool = AVVideoCompositionCoreAnimationTool.FromLayer (videoLayer, parentLayer);
AVMutableVideoCompositionInstruction instruction = AVMutableVideoCompositionInstruction.Create () as AVMutableVideoCompositionInstruction;
CMTimeRange timeRangeInstruction = new CMTimeRange ();
timeRangeInstruction.Start = CMTime.Zero;
timeRangeInstruction.Duration = mixComposition.Duration;
instruction.TimeRange = timeRangeInstruction;
AVAssetTrack videoTrack = mixComposition.TracksWithMediaType (AVMediaType.Video) [0];
AVMutableVideoCompositionLayerInstruction layerInstruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack (videoTrack);
instruction.LayerInstructions = new AVVideoCompositionLayerInstruction[] { layerInstruction };
List<AVVideoCompositionInstruction> instructions = new List<AVVideoCompositionInstruction> ();
instructions.Add (instruction);
videoComposition.Instructions = instructions.ToArray ();
var exportSession = new AVAssetExportSession (mixComposition, AVAssetExportSession.PresetMediumQuality);
exportSession.VideoComposition = videoComposition;
Console.WriteLine ("Original path is {0}", filepath);
string newFileName = Path.GetFileName (filepath);
newFileName = newFileName.Replace (".mp4", "_wm.mp4");
string directoryName = Path.GetDirectoryName (filepath);
string videoOutFilePath = Path.Combine (directoryName, newFileName);
Console.WriteLine ("New path is {0}", videoOutFilePath);
exportSession.OutputFileType = AVFileType.Mpeg4;
exportSession.OutputUrl = NSUrl.FromFilename (videoOutFilePath);
exportSession.ShouldOptimizeForNetworkUse = true;
exportSession.ExportAsynchronously (() => {
AVAssetExportSessionStatus status = exportSession.Status;
Console.WriteLine ("Done with handler. Status: " + status.ToString ());
switch (status) {
case AVAssetExportSessionStatus.Completed:
Console.WriteLine ("Successfully Completed");
if (File.Exists (videoOutFilePath)) {
Console.WriteLine ("Created!!");
} else
Console.WriteLine ("Failed");
break;
case AVAssetExportSessionStatus.Cancelled:
break;
case AVAssetExportSessionStatus.Exporting:
break;
case AVAssetExportSessionStatus.Failed:
Console.WriteLine ("Task failed => {0}", exportSession.Error);
Console.WriteLine (exportSession.Error.Description);
break;
case AVAssetExportSessionStatus.Unknown:
break;
case AVAssetExportSessionStatus.Waiting:
break;
default:
break;
}
});
if (File.Exists (videoOutFilePath))
return videoOutFilePath;
} catch (Exception ex) {
Console.WriteLine ("Error occured : {0}", ex.Message);
}
I keep getting the following error:
Task failed => Cannot Complete Export
Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x18896070 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}
If I replace this
var exportSession = new AVAssetExportSession (mixComposition, AVAssetExportSession.PresetMediumQuality);
with
var exportSession = new AVAssetExportSession (videoAsset, AVAssetExportSession.PresetMediumQuality);
it works fine, but with no watermark.
Can anyone help with this?
So I managed to solve this issue by changing these lines
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
to this:
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionAudioTrack.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
Although I'm not sure why this worked. Could anyone explain this?

CIDetector.RectDetector bounds to view bounds coordinates

So,
I am trying to display a rectangle around a detected document (A4).
I am using an AVCaptureSession for the feed, along with the AVCaptureStillImageOutput output.
NSError Error = null;
Session = new AVCaptureSession();
AVCaptureDevice Device = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
AVCaptureDeviceInput DeviceInput = AVCaptureDeviceInput.FromDevice(Device, out Error);
Session.AddInput(DeviceInput);
AVCaptureStillImageOutput CaptureOutput = new AVCaptureStillImageOutput();
CaptureOutput.OutputSettings = new NSDictionary(AVVideo.CodecKey, AVVideo.CodecJPEG) ;
Session.AddOutput(CaptureOutput);
I have a timer that takes the output and passes that to my handler
NSTimer.CreateRepeatingScheduledTimer(TimeSpan.Parse("00:00:02"), delegate
{
CaptureImageWithMetadata(CaptureOutput,CaptureOutput.Connections[0]);
});
I also have an AVCaptureVideoPreviewLayer with its bounds set to full screen (iPad Mini, portrait):
PreviewLayer = new AVCaptureVideoPreviewLayer(Session);
PreviewLayer.Frame = this.View.Frame;
PreviewLayer.VideoGravity = AVLayerVideoGravity.ResizeAspectFill;
this.View.Layer.AddSublayer(PreviewLayer);
PreviewLayer.ZPosition = (PreviewLayer.ZPosition - 1);
Below is the handler
private async void CaptureImageWithMetadata(AVCaptureStillImageOutput output, AVCaptureConnection connection)
{
var sampleBuffer = await output.CaptureStillImageTaskAsync(connection);
var imageData = AVCaptureStillImageOutput.JpegStillToNSData(sampleBuffer);
var image = CIImage.FromData(imageData);
var metadata = image.Properties.Dictionary.MutableCopy() as NSMutableDictionary;
CIContext CT = CIContext.FromOptions(null);
CIDetectorOptions OP = new CIDetectorOptions();
OP.Accuracy = FaceDetectorAccuracy.High;
OP.AspectRatio = 1.41f;
CIDetector CI = CIDetector.CreateRectangleDetector(CT, OP);
CIFeature[] HH = CI.FeaturesInImage(image,CIImageOrientation.BottomRight);
CGAffineTransform Transfer = CGAffineTransform.MakeScale(1, -1);
Transfer = CGAffineTransform.Translate(Transfer, 0, -this.View.Bounds.Size.Height);
if (HH.Length > 0)
{
CGRect RECT = CGAffineTransform.CGRectApplyAffineTransform(HH[0].Bounds, Transfer);
Console.WriteLine("start");
Console.WriteLine("IMAGE : "+HH[0].Bounds.ToString());
Console.WriteLine("SCREEN :"+RECT.ToString());
Console.WriteLine("end");
BB.Frame = RECT;
BB.Hidden = false;
}
}
However, despite following a guide that suggested I need to convert the coordinates, my highlighter (green) does not surround the document, and I can't figure out why.
I am using CIImageOrientation.BottomRight just as a test, but no matter what I put here I always get the same result. See images.
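As a sketch of the conversion usually needed here: CIDetector bounds are in the captured image's coordinate space (origin bottom-left, in pixels), while the view uses origin top-left in points, so the rect has to be scaled and Y-flipped. ImageRectToViewRect below is a hypothetical helper and assumes the preview shows the whole image; with ResizeAspectFill the crop offset would also have to be subtracted:

```csharp
CGRect ImageRectToViewRect(CGRect r, CGSize imageSize, CGSize viewSize)
{
    var scaleX = viewSize.Width / imageSize.Width;
    var scaleY = viewSize.Height / imageSize.Height;
    // Core Image's origin is bottom-left; UIKit's is top-left, so flip Y.
    var flippedY = imageSize.Height - r.Y - r.Height;
    return new CGRect(r.X * scaleX, flippedY * scaleY,
                      r.Width * scaleX, r.Height * scaleY);
}
```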

Trimming video with Monotouch fails with "The operation could not be completed"

I am trying to trim a video to 5 seconds programmatically. Here is my implementation.
AVAssetExportSession exportSession= new AVAssetExportSession(videoAsset,AVAssetExportSession.PresetLowQuality.ToString());
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
string filename;
if (SystemVersion >= 8)
{
var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
filename = Path.Combine(documents, "trimmed.mov");
}
else
{
var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments); // iOS 7 and earlier
filename = Path.Combine(documents, "trimmed.mov");
}
outputUrl=new NSUrl(filename);
exportSession.OutputUrl = outputUrl;
CMTime start = new CMTime((long)1, 1);
CMTime duration = new CMTime((long)5, 1);
CMTimeRange range = new CMTimeRange();
range.Start=start;
range.Duration=duration;
exportSession.TimeRange = range;
exportSession.OutputFileType = AVFileType.QuickTimeMovie;
ExportTrimmedVideo( exportSession);
async void ExportTrimmedVideo(AVAssetExportSession exportSession)
{
await exportSession.ExportTaskAsync ();
if (exportSession.Status == AVAssetExportSessionStatus.Completed) {
InvokeOnMainThread (() => {
new UIAlertView ("Export Success", "Video is trimmed", null, "OK").Show ();
});
}
else
{
InvokeOnMainThread (() => {
new UIAlertView ("Export Failure", exportSession.Error.Description, null, "OK").Show ();
});
}
}
But on completion I am getting a Failed status. The full NSError description is as follows:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x7cebcf80 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x7cb08410 "The operation couldn’t be completed. (OSStatus error -12105.)", NSLocalizedFailureReason=An unknown error occurred (-12105)}
What am I possibly doing wrong?
Edit
I have referred to Apple's documentation on trimming video and have modified the above code as below, with no positive effect.
var compatiblePresets= AVAssetExportSession.ExportPresetsCompatibleWithAsset(videoAsset).ToList();
var preset="";
if(compatiblePresets.Contains("AVAssetExportPresetLowQuality"))
{
preset="AVAssetExportPresetLowQuality";
}
else
{
preset=compatiblePresets.FirstOrDefault();
}
AVAssetExportSession exportSession= new AVAssetExportSession(videoAsset,preset);
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
string filename;
if (SystemVersion >= 8)
{
var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
filename = Path.Combine(documents, "trimmed.mov");
}
else
{
var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments); // iOS 7 and earlier
filename = Path.Combine(documents, "trimmed.mov");
}
outputUrl=new NSUrl(filename);
exportSession.OutputUrl = outputUrl;
exportSession.OutputFileType = AVFileType.QuickTimeMovie;
CMTime start = new CMTime((long)1, 600);
CMTime duration = new CMTime((long)5, 600);
CMTimeRange range = new CMTimeRange();
range.Start=start;
range.Duration=duration;
exportSession.TimeRange = range;
ExportTrimmedVideo( exportSession);
Try this code below. I modified exportSession.OutputUrl and how you initialize your CMTimeRange. Are you trimming it down to a 4 second clip?
var compatiblePresets= AVAssetExportSession.ExportPresetsCompatibleWithAsset(videoAsset).ToList();
var preset="";
if(compatiblePresets.Contains("AVAssetExportPresetLowQuality"))
{
preset="AVAssetExportPresetLowQuality";
}
else
{
preset=compatiblePresets.FirstOrDefault();
}
using (var exportSession = new AVAssetExportSession(videoAsset, preset))
{
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
string filename;
if (SystemVersion >= 8)
{
var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
filename = Path.Combine(documents, "trimmed.mov");
}
else
{
var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments); // iOS 7 and earlier
filename = Path.Combine(documents, "trimmed.mov");
}
exportSession.OutputUrl = NSUrl.FromFilename(filename);
exportSession.OutputFileType = AVFileType.QuickTimeMovie;
var range = new CMTimeRange();
range.Start = CMTime.FromSeconds (1, videoAsset.Duration.TimeScale);
range.Duration = CMTime.FromSeconds (5, videoAsset.Duration.TimeScale);
exportSession.TimeRange = range;
// call the export while the session is still in scope (before it is disposed)
ExportTrimmedVideo(exportSession);
}