Xamarin.iOS - ExportAsynchronously fails

Firstly, I should mention I'm new at this (iOS and Xamarin).
I'm trying to watermark a video using an image and text.
I've ported most of the code from the link: https://stackoverflow.com/a/22016800/5275669.
The entire code is pasted here:
try {
    var videoAsset = AVUrlAsset.FromUrl (new NSUrl (filepath, false)) as AVUrlAsset;
    AVMutableComposition mixComposition = AVMutableComposition.Create ();
    var compositionVideoTracks = mixComposition.AddMutableTrack (AVMediaType.Video, 0);
    AVAssetTrack clipVideoTrack = videoAsset.TracksWithMediaType (AVMediaType.Video) [0];
    var compositionAudioTrack = mixComposition.AddMutableTrack (AVMediaType.Audio, 0);
    AVAssetTrack clipAudioTrack = videoAsset.TracksWithMediaType (AVMediaType.Audio) [0];
    NSError error;
    CMTimeRange timeRangeInAsset = new CMTimeRange ();
    timeRangeInAsset.Start = CMTime.Zero;
    timeRangeInAsset.Duration = videoAsset.Duration;
    compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
    compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
    compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
    CGSize sizeOfVideo = videoAsset.NaturalSize;
    CATextLayer textOfvideo = (CATextLayer)CATextLayer.Create ();
    textOfvideo.String = String.Format ("{0} {1}", DateTime.Now.ToLongTimeString (), "Test app");
    textOfvideo.SetFont (CGFont.CreateWithFontName ("Helvetica"));
    textOfvideo.FontSize = 50;
    textOfvideo.AlignmentMode = CATextLayer.AlignmentCenter;
    textOfvideo.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height / 6);
    textOfvideo.ForegroundColor = new CGColor (255, 0, 0);
    UIImage myImage = UIImage.FromFile ("Icon-Small.png");
    CALayer layerCa = CALayer.Create ();
    layerCa.Contents = myImage.CGImage;
    layerCa.Frame = new CGRect (0, 0, 100, 100);
    layerCa.Opacity = 0.65F;
    CALayer optionalLayer = CALayer.Create ();
    optionalLayer.AddSublayer (textOfvideo);
    optionalLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
    optionalLayer.MasksToBounds = true;
    CALayer parentLayer = CALayer.Create ();
    CALayer videoLayer = CALayer.Create ();
    parentLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
    videoLayer.Frame = new CGRect (0, 0, sizeOfVideo.Width, sizeOfVideo.Height);
    parentLayer.AddSublayer (videoLayer);
    parentLayer.AddSublayer (layerCa);
    parentLayer.AddSublayer (textOfvideo);
    AVMutableVideoComposition videoComposition = AVMutableVideoComposition.Create ();
    videoComposition.RenderSize = sizeOfVideo;
    videoComposition.FrameDuration = new CMTime (1, 30);
    videoComposition.AnimationTool = AVVideoCompositionCoreAnimationTool.FromLayer (videoLayer, parentLayer);
    AVMutableVideoCompositionInstruction instruction = AVMutableVideoCompositionInstruction.Create () as AVMutableVideoCompositionInstruction;
    CMTimeRange timeRangeInstruction = new CMTimeRange ();
    timeRangeInstruction.Start = CMTime.Zero;
    timeRangeInstruction.Duration = mixComposition.Duration;
    instruction.TimeRange = timeRangeInstruction;
    AVAssetTrack videoTrack = mixComposition.TracksWithMediaType (AVMediaType.Video) [0];
    AVMutableVideoCompositionLayerInstruction layerInstruction = AVMutableVideoCompositionLayerInstruction.FromAssetTrack (videoTrack);
    instruction.LayerInstructions = new AVVideoCompositionLayerInstruction[] { layerInstruction };
    List<AVVideoCompositionInstruction> instructions = new List<AVVideoCompositionInstruction> ();
    instructions.Add (instruction);
    videoComposition.Instructions = instructions.ToArray ();
    var exportSession = new AVAssetExportSession (mixComposition, AVAssetExportSession.PresetMediumQuality);
    exportSession.VideoComposition = videoComposition;
    Console.WriteLine ("Original path is {0}", filepath);
    string newFileName = Path.GetFileName (filepath);
    newFileName = newFileName.Replace (".mp4", "_wm.mp4");
    string directoryName = Path.GetDirectoryName (filepath);
    string videoOutFilePath = Path.Combine (directoryName, newFileName);
    Console.WriteLine ("New path is {0}", videoOutFilePath);
    exportSession.OutputFileType = AVFileType.Mpeg4;
    exportSession.OutputUrl = NSUrl.FromFilename (videoOutFilePath);
    exportSession.ShouldOptimizeForNetworkUse = true;
    exportSession.ExportAsynchronously (() => {
        AVAssetExportSessionStatus status = exportSession.Status;
        Console.WriteLine ("Done with handler. Status: " + status.ToString ());
        switch (status) {
        case AVAssetExportSessionStatus.Completed:
            Console.WriteLine ("Successfully completed");
            if (File.Exists (videoOutFilePath)) {
                Console.WriteLine ("Created!!");
            } else {
                Console.WriteLine ("Failed");
            }
            break;
        case AVAssetExportSessionStatus.Cancelled:
            break;
        case AVAssetExportSessionStatus.Exporting:
            break;
        case AVAssetExportSessionStatus.Failed:
            Console.WriteLine ("Task failed => {0}", exportSession.Error);
            Console.WriteLine (exportSession.Error.Description);
            break;
        case AVAssetExportSessionStatus.Unknown:
            break;
        case AVAssetExportSessionStatus.Waiting:
            break;
        default:
            break;
        }
    });
    if (File.Exists (videoOutFilePath))
        return videoOutFilePath;
} catch (Exception ex) {
    Console.WriteLine ("Error occurred: {0}", ex.Message);
}
I keep getting the following error:
Task failed => Cannot Complete Export
Error Domain=AVFoundationErrorDomain Code=-11820 "Cannot Complete Export" UserInfo=0x18896070 {NSLocalizedRecoverySuggestion=Try exporting again., NSLocalizedDescription=Cannot Complete Export}
If I replace this
var exportSession = new AVAssetExportSession (mixComposition, AVAssetExportSession.PresetMediumQuality);
with
var exportSession = new AVAssetExportSession (videoAsset, AVAssetExportSession.PresetMediumQuality);
it works fine, but with no watermark.
Can anyone help with this?

So I managed to solve this issue by changing these lines:
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
to this:
compositionVideoTracks.InsertTimeRange (timeRangeInAsset, clipVideoTrack, CMTime.Zero, out error);
compositionAudioTrack.InsertTimeRange (timeRangeInAsset, clipAudioTrack, CMTime.Zero, out error);
compositionVideoTracks.PreferredTransform = clipVideoTrack.PreferredTransform;
Although I'm not sure why this worked. Could anyone explain this?
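For what it's worth, my reading (an editorial note, not from the thread): each AVMutableCompositionTrack carries exactly one media type, so the original code pushed the audio clip's samples into the video composition track and left compositionAudioTrack empty, producing a composition the exporter could not write out (-11820). A quick way to sanity-check a composition before exporting (a sketch; names match the question's code):

foreach (var track in mixComposition.Tracks)
    Console.WriteLine ("{0} track {1}: {2} segment(s)",
        track.MediaType, track.TrackID, track.Segments.Length);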

Related

How to change CreationDate Resource Value

I'm writing a recording app that enables the user to trim parts of previous recordings and concatenate them into one new recording.
My problem is: let's say I recorded an hour-long track and I want to trim the first 2 minutes of that track. When I export these 2 minutes, the creation date of the new track will be "now", and I need it to match the date those 2 minutes actually took place.
So basically I'm trying to modify the track's URL resource values, but I want to change only the creation date.
Is there a way to do this? Or is there a way to add a new resource value key? Or a way to attach the needed date to the URL?
func trimStatringPoint(_ from: Date, startOffSet: TimeInterval, duration: TimeInterval, fileName: String, file: URL, completion: fileExportaionBlock?) {
    if let asset = AVURLAsset(url: file) as AVAsset? {
        var trimmedFileUrl = documentsDirectory().appendingPathComponent(fileName)
        let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetAppleM4A)
        exporter?.outputFileType = AVFileTypeAppleM4A
        exporter?.outputURL = trimmedFileUrl
        let start = CMTimeMake(Int64(startOffSet), 1)
        let end = CMTimeMake(Int64(startOffSet + duration), 1)
        exporter?.timeRange = CMTimeRangeFromTimeToTime(start, end)
        exporter?.exportAsynchronously {
            if exporter?.status != AVAssetExportSessionStatus.completed {
                print("Error while exporting \(exporter?.error?.localizedDescription ?? "unknown")")
                completion?(nil)
                return
            }
        }
        //------------------------------------------------------
        // this code needs to be replaced
        do {
            var resourceValus = URLResourceValues()
            resourceValus.creationDate = from
            try trimmedFileUrl.setResourceValues(resourceValus)
        } catch {
            deleteFile(atPath: trimmedFileUrl)
            print("Error while setting date - \(error.localizedDescription)")
            completion?(nil)
            return
        }
        //------------------------------------------------------
        completion?(trimmedFileUrl)
    }
}
Have you tried modifying the metadata of the exported recording?
https://developer.apple.com/documentation/avfoundation/avmetadatacommonkeycreationdate
AVMutableMetadataItem *metaItem = [AVMutableMetadataItem metadataItem];
metaItem.key = AVMetadataCommonKeyCreationDate;
metaItem.keySpace = AVMetadataKeySpaceCommon;
metaItem.value = [NSDate date];
NSArray *metadata = @[ metaItem ];
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
exportSession.metadata = metadata;
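For the Xamarin/C# side, the same idea would look roughly like this (a sketch; the binding names AVMetadata.CommonKeyCreationDate and AVMetadata.KeySpaceCommon are assumptions from memory, so verify them against your Xamarin.iOS version):

var metaItem = new AVMutableMetadataItem ();
metaItem.Key = AVMetadata.CommonKeyCreationDate; // assumed binding name
metaItem.KeySpace = AVMetadata.KeySpaceCommon;   // assumed binding name
metaItem.Value = (NSDate)DateTime.UtcNow;
exportSession.Metadata = new AVMetadataItem[] { metaItem };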

CIDetector.RectDetector bounds to view bounds coordinates

So, I am trying to display a rectangle around a detected document (A4).
I am using an AVCaptureSession for the feed, along with an AVCaptureStillImageOutput output.
NSError Error = null;
Session = new AVCaptureSession();
AVCaptureDevice Device = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
AVCaptureDeviceInput DeviceInput = AVCaptureDeviceInput.FromDevice(Device, out Error);
Session.AddInput(DeviceInput);
AVCaptureStillImageOutput CaptureOutput = new AVCaptureStillImageOutput();
CaptureOutput.OutputSettings = new NSDictionary(AVVideo.CodecKey, AVVideo.CodecJPEG) ;
Session.AddOutput(CaptureOutput);
I have a timer that takes the output and passes it to my handler:
NSTimer.CreateRepeatingScheduledTimer(TimeSpan.Parse("00:00:02"), delegate
{
    CaptureImageWithMetadata(CaptureOutput, CaptureOutput.Connections[0]);
});
I also have an AVCaptureVideoPreviewLayer with its bounds being full screen (iPad Mini, portrait):
PreviewLayer = new AVCaptureVideoPreviewLayer(Session);
PreviewLayer.Frame = this.View.Frame;
PreviewLayer.VideoGravity = AVLayerVideoGravity.ResizeAspectFill;
this.View.Layer.AddSublayer(PreviewLayer);
PreviewLayer.ZPosition = (PreviewLayer.ZPosition - 1);
Below is the handler
private async void CaptureImageWithMetadata(AVCaptureStillImageOutput output, AVCaptureConnection connection)
{
    var sampleBuffer = await output.CaptureStillImageTaskAsync(connection);
    var imageData = AVCaptureStillImageOutput.JpegStillToNSData(sampleBuffer);
    var image = CIImage.FromData(imageData);
    var metadata = image.Properties.Dictionary.MutableCopy() as NSMutableDictionary;
    CIContext CT = CIContext.FromOptions(null);
    CIDetectorOptions OP = new CIDetectorOptions();
    OP.Accuracy = FaceDetectorAccuracy.High;
    OP.AspectRatio = 1.41f;
    CIDetector CI = CIDetector.CreateRectangleDetector(CT, OP);
    CIFeature[] HH = CI.FeaturesInImage(image, CIImageOrientation.BottomRight);
    CGAffineTransform Transfer = CGAffineTransform.MakeScale(1, -1);
    Transfer = CGAffineTransform.Translate(Transfer, 0, -this.View.Bounds.Size.Height);
    if (HH.Length > 0)
    {
        CGRect RECT = CGAffineTransform.CGRectApplyAffineTransform(HH[0].Bounds, Transfer);
        Console.WriteLine("start");
        Console.WriteLine("IMAGE : " + HH[0].Bounds.ToString());
        Console.WriteLine("SCREEN :" + RECT.ToString());
        Console.WriteLine("end");
        BB.Frame = RECT;
        BB.Hidden = false;
    }
}
However, even after following a guide that suggested I need to convert the coordinates, my highlighter (green) does not surround the document, and I can't figure out why.
I am using CIImageOrientation.BottomRight just as a test, but no matter what I put here I always get the same result. See images.
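One likely culprit (an editorial note, not from the thread): HH[0].Bounds comes back in the CIImage's pixel coordinate space with a bottom-left origin, while the transform above flips using the view's height in points. A conversion that scales from image size to view size and flips Y in image space would look roughly like this (a sketch; aspect-fill cropping is ignored):

CGRect ImageRectToViewRect (CGRect r, CGSize imageSize, CGSize viewSize)
{
    var sx = viewSize.Width / imageSize.Width;
    var sy = viewSize.Height / imageSize.Height;
    // flip the Y axis in image space, then scale into view points
    return new CGRect (r.X * sx,
                       (imageSize.Height - r.Y - r.Height) * sy,
                       r.Width * sx,
                       r.Height * sy);
}

Because the preview layer uses AVLayerVideoGravity.ResizeAspectFill, the image is cropped to fill the screen, so both scale factors really need to come from the dimension that fills; that mismatch alone can leave the rectangle offset no matter which CIImageOrientation is passed.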

Trimming video with Monotouch fails with "The operation could not be completed"

I am trying to trim a video to 5 seconds programmatically. Here is my implementation.
AVAssetExportSession exportSession = new AVAssetExportSession(videoAsset, AVAssetExportSession.PresetLowQuality.ToString());
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
string filename;
if (SystemVersion >= 8)
{
    var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
    filename = Path.Combine(documents, "trimmed.mov");
}
else
{
    var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments); // iOS 7 and earlier
    filename = Path.Combine(documents, "trimmed.mov");
}
outputUrl = new NSUrl(filename);
exportSession.OutputUrl = outputUrl;
CMTime start = new CMTime((long)1, 1);
CMTime duration = new CMTime((long)5, 1);
CMTimeRange range = new CMTimeRange();
range.Start = start;
range.Duration = duration;
exportSession.TimeRange = range;
exportSession.OutputFileType = AVFileType.QuickTimeMovie;
ExportTrimmedVideo(exportSession);
async void ExportTrimmedVideo(AVAssetExportSession exportSession)
{
    await exportSession.ExportTaskAsync();
    if (exportSession.Status == AVAssetExportSessionStatus.Completed)
    {
        InvokeOnMainThread(() => {
            new UIAlertView("Export Success", "Video is trimmed", null, "OK").Show();
        });
    }
    else
    {
        InvokeOnMainThread(() => {
            new UIAlertView("Export Failure", exportSession.Error.Description, null, "OK").Show();
        });
    }
}
But on completion I am getting a Failed status. The full NSError description is as follows:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x7cebcf80 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x7cb08410 "The operation couldn’t be completed. (OSStatus error -12105.)", NSLocalizedFailureReason=An unknown error occurred (-12105)}
What am I possibly doing wrong?
Edit
I have referred to Apple's documentation on trimming video and have modified the above code as below, with no positive effect.
var compatiblePresets = AVAssetExportSession.ExportPresetsCompatibleWithAsset(videoAsset).ToList();
var preset = "";
if (compatiblePresets.Contains("AVAssetExportPresetLowQuality"))
{
    preset = "AVAssetExportPresetLowQuality";
}
else
{
    preset = compatiblePresets.FirstOrDefault();
}
AVAssetExportSession exportSession = new AVAssetExportSession(videoAsset, preset);
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
string filename;
if (SystemVersion >= 8)
{
    var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
    filename = Path.Combine(documents, "trimmed.mov");
}
else
{
    var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments); // iOS 7 and earlier
    filename = Path.Combine(documents, "trimmed.mov");
}
outputUrl = new NSUrl(filename);
exportSession.OutputUrl = outputUrl;
exportSession.OutputFileType = AVFileType.QuickTimeMovie;
CMTime start = new CMTime((long)1, 600);
CMTime duration = new CMTime((long)5, 600);
CMTimeRange range = new CMTimeRange();
range.Start = start;
range.Duration = duration;
exportSession.TimeRange = range;
ExportTrimmedVideo(exportSession);
Try this code below. I modified exportSession.OutputUrl and how you initialize your CMTimeRange. Are you trimming it down to a 4 second clip?
var compatiblePresets = AVAssetExportSession.ExportPresetsCompatibleWithAsset(videoAsset).ToList();
var preset = "";
if (compatiblePresets.Contains("AVAssetExportPresetLowQuality"))
{
    preset = "AVAssetExportPresetLowQuality";
}
else
{
    preset = compatiblePresets.FirstOrDefault();
}
var exportSession = new AVAssetExportSession(videoAsset, preset);
int SystemVersion = Convert.ToInt16(UIDevice.CurrentDevice.SystemVersion.Split('.')[0]);
string filename;
if (SystemVersion >= 8)
{
    var documents = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
    filename = Path.Combine(documents, "trimmed.mov");
}
else
{
    var documents = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments); // iOS 7 and earlier
    filename = Path.Combine(documents, "trimmed.mov");
}
exportSession.OutputUrl = NSUrl.FromFilename(filename);
exportSession.OutputFileType = AVFileType.QuickTimeMovie;
var range = new CMTimeRange();
range.Start = CMTime.FromSeconds(1, videoAsset.Duration.TimeScale);
range.Duration = CMTime.FromSeconds(5, videoAsset.Duration.TimeScale);
exportSession.TimeRange = range;
ExportTrimmedVideo(exportSession);
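One more thing worth ruling out (an editorial note, not part of the answer above): AVAssetExportSession refuses to overwrite an existing file, so a leftover trimmed.mov from a previous run will fail the export on its own. Deleting it first is cheap:

if (File.Exists(filename))
    File.Delete(filename);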

AVAssetExportSession - The video could not be composed

I am trying to do some basic video composition in Xamarin / MonoTouch and am having some success, but am stuck on what seems to be a rather simple task.
I record videos from the camera in portrait, so I use AVAssetExportSession to rotate the videos. I have created a layer instruction to rotate the video, which works fine. I am able to successfully export the video in the correct orientation.
The Issue:
When I add the audio track into the export I always get a failed response with this error:
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just with the wrong orientation. If anyone could give me some advice it would be greatly appreciated. Below is my code:
var composition = new AVMutableComposition();
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
var index = 0;
var renderSize = new SizeF(480, 480);
var _startTime = CMTime.Zero;
//AVUrlAsset asset;
var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
//var asset = AVAsset.FromUrl(new NSUrl(file, false));
//create an avassetrack with our asset
var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];
//create a video composition and preset some settings
NSError error;
var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };
compositionTrackAudio.InsertTimeRange(new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = asset.Duration,
}, audioTrack, _startTime, out error);
if (error != null)
{
    Debug.WriteLine(error.Description);
}
compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);
//create a video instruction
var transformer = new AVMutableVideoCompositionLayerInstruction
{
    TrackID = videoTrack.TrackID,
};
var audioMix = new AVMutableAudioMix();
var mixParameters = new AVMutableAudioMixInputParameters
{
    TrackID = audioTrack.TrackID
};
mixParameters.SetVolumeRamp(1.0f, 1.0f, new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = asset.Duration
});
audioMix.InputParameters = new[] { mixParameters };
var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
//Make sure the square is portrait
var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
var finalTransform = t2;
transformer.SetTransform(finalTransform, CMTime.Zero);
//add the transformer layer instructions, then add to video composition
var instruction = new AVMutableVideoCompositionInstruction
{
    TimeRange = assetTimeRange,
    LayerInstructions = new[] { transformer }
};
videoCompositionInstructions[index] = instruction;
index++;
_startTime = CMTime.Add(_startTime, asset.Duration);
var videoComposition = new AVMutableVideoComposition();
videoComposition.FrameDuration = new CMTime(1, (int)videoTrack.NominalFrameRate);
videoComposition.RenderScale = 1;
videoComposition.Instructions = videoCompositionInstructions;
videoComposition.RenderSize = renderSize;
var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);
var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";
var outputLocation = new NSUrl(filePath, false);
exportSession.OutputUrl = outputLocation;
exportSession.OutputFileType = AVFileType.Mpeg4;
exportSession.VideoComposition = videoComposition;
exportSession.AudioMix = audioMix;
exportSession.ShouldOptimizeForNetworkUse = true;
exportSession.ExportAsynchronously(() =>
{
    Debug.WriteLine(exportSession.Status);
    switch (exportSession.Status)
    {
        case AVAssetExportSessionStatus.Failed:
        {
            Debug.WriteLine(exportSession.Error.Description);
            Debug.WriteLine(exportSession.Error.DebugDescription);
            break;
        }
        case AVAssetExportSessionStatus.Completed:
        {
            if (File.Exists(filePath))
            {
                _uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
                Task.Run(async () =>
                {
                    await _uploadService.UploadVideo(_videoData);
                });
            }
            break;
        }
        case AVAssetExportSessionStatus.Unknown:
        {
            break;
        }
        case AVAssetExportSessionStatus.Exporting:
        {
            break;
        }
        case AVAssetExportSessionStatus.Cancelled:
        {
            break;
        }
    }
});
So this was a really stupid mistake: it was due to adding the audio track in before the video track, so the instructions must have been trying to apply the transform to the audio track rather than my video track.
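A minimal sketch of the reordering that answer describes, with the rest of the code above unchanged:

// Insert the video track first so the first composition track (the one the
// layer instruction ends up transforming) is video rather than audio.
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);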
My problem was that I forgot to set the timeRange; it should be like this:
let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layer]
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)
Note that AVMutableVideoCompositionInstruction.timeRange's end time must be valid. It is different from AVAssetExportSession.timeRange, whose documentation reads:
The time range to be exported from the source.
The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possible limit on file length) the full duration of the asset will be exported.
You can observe this property using key-value observing.
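Translated to the C# code in the question, the fix would look something like this (a sketch; composition is the AVMutableComposition being exported):

instruction.TimeRange = new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = composition.Duration // must be a concrete, finite duration
};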

GetAudio Plugin for MvvmCross

I'm developing a plugin for MvvmCross to get audio from the device, but I'm getting an error in the Touch implementation. I've already taken a look here, here, and here, but nothing fixed my problem.
Well, so far I have:
var audioDelegate = new AudioDelegate();
audioDelegate.AudioAvailable = ProcessAudio;
mediaPicker = new MPMediaPickerController();
mediaPicker.Delegate = audioDelegate;
mediaPicker.AllowsPickingMultipleItems = false;
modalHost.PresentModalViewController (mediaPicker, true);
This starts the picker for audio, where AudioDelegate is:
private class AudioDelegate : MPMediaPickerControllerDelegate
{
    public EventHandler<MvxAudioRecorderEventArgs> AudioAvailable;

    public override void MediaItemsPicked (MPMediaPickerController sender, MPMediaItemCollection mediaItemCollection)
    {
        if (mediaItemCollection.Count < 1)
        {
            return;
        }
        MvxAudioRecorderEventArgs eventArg = new MvxAudioRecorderEventArgs (mediaItemCollection.Items [0].AssetURL);
        AudioAvailable (this, eventArg);
    }
}
And then in ProcessAudio:
private void ProcessMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    var assetURL = e.MediaUrl;
    NSDictionary dictionary = null;
    var assetExtension = e.MediaUrl.Path.Split ('.') [1];
    var songAsset = new AVUrlAsset (assetURL, dictionary);
    var exporter = new AVAssetExportSession (songAsset, AVAssetExportSession.PresetPassthrough.ToString ());
    exporter.OutputFileType = (assetExtension == "mp3") ? "com.apple.quicktime-movie" : AVFileType.AppleM4A;
    var manager = new NSFileManager ();
    var count = 0;
    string filePath = null;
    do
    {
        var extension = "mov"; //( NSString *)UTTypeCopyPreferredTagWithClass(( CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension);
        var fileNameNoExtension = "AUD_" + Guid.NewGuid ().ToString ();
        var fileName = string.Format ("{0}({1})", fileNameNoExtension, count);
        filePath = Environment.GetFolderPath (Environment.SpecialFolder.MyDocuments) + "/";
        filePath = filePath + fileName + "." + extension;
        count++;
    } while (manager.FileExists (filePath));
    var outputURL = new NSUrl (filePath); //should be something in objective C... => [NSURL fileURLWithPath:filePath];
    exporter.OutputUrl = outputURL;
    exporter.ExportAsynchronously (() =>
    {
        var exportStatus = exporter.Status;
        switch (exportStatus)
        {
            case AVAssetExportSessionStatus.Unknown:
            case AVAssetExportSessionStatus.Completed:
            {
                //remove the .mov from file and make it available for callback mediaAvailable()...
                break;
            }
            case AVAssetExportSessionStatus.Waiting:
            case AVAssetExportSessionStatus.Cancelled:
            case AVAssetExportSessionStatus.Exporting:
            case AVAssetExportSessionStatus.Failed:
            default:
            {
                var exportError = exporter.Error;
                if (assumeCancelled != null)
                {
                    assumeCancelled ();
                }
                break;
            }
        }
    });
    mediaPicker.DismissViewController (true, () => { });
    modalHost.NativeModalViewControllerDisappearedOnItsOwn ();
}
But it always ends with Status.Failed and this error:
LocalizedDescription: The operation could not be completed
Description: Error Domain=AVFoundationErrorDomain Code=-11800
"The operation could not be completed" UserInfo=0x1476cce0
{NSLocalizedDescription=The operation could not be completed,
NSUnderlyingError=0x1585da90 "The operation couldn’t be completed.
(OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error
occurred (-12780)}
Can anyone help me?
Thanks in advance,
Found the solution. As I thought, the error was in the output URL. I changed it to:
var count = 0;
string filePath = null;
do
{
    var extension = "mp3.mov"; //( NSString *)UTTypeCopyPreferredTagWithClass(( CFStringRef)AVFileTypeQuickTimeMovie, kUTTagClassFilenameExtension);
    var fileNameNoExtension = "AUD_" + Guid.NewGuid ().ToString ();
    var fileName = (count == 0) ? fileNameNoExtension : string.Format ("{0}({1})", fileNameNoExtension, count);
    filePath = NSBundle.MainBundle.BundlePath + "/../tmp/"; /* HERE WAS THE FIRST PROBLEM, USE NSBUNDLE... */
    filePath = filePath + fileName + "." + extension;
    count++;
} while (manager.FileExists (filePath));
var outputURL = NSUrl.FromFilename (filePath); /* HERE WAS THE SECOND PROBLEM, CREATE IT WITH FROMFILENAME INSTEAD OF NEW... */
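For context (an editorial note): new NSUrl(path) parses its argument as a URL string, with no file:// scheme and no percent-escaping, so AVAssetExportSession rejects it, whereas NSUrl.FromFilename wraps fileURLWithPath: and yields a proper file URL:

var fileUrl = NSUrl.FromFilename ("/tmp/audio.mov"); // file:///tmp/audio.mov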
And then in the export, just remove the .mov extension...
var withoutMOVPath = outputURL.Path.Remove (outputURL.Path.Length - 4);
NSError error = null;
manager.Move (outputURL.Path, withoutMOVPath, out error);
if (error != null && assumeCancelled != null)
{
    assumeCancelled ();
    return;
}
var mediaStream = new FileStream (withoutMOVPath, FileMode.Open);
mediaAvailable (mediaStream);
break;
