How do I convert MOV to MP4 in Xamarin.iOS? I am using AVAssetExportSession, but the last few seconds of the video get clipped off. Is there a solution to retain the full video duration?
Here is my code.
var basePath = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
var inputFilePath = Path.Combine(basePath, Path.ChangeExtension(_vm.Model.AnswerName, "mov"));
var outputFilePath = Path.Combine(basePath, _vm.Model.AnswerName);
var asset = AVAsset.FromUrl(NSUrl.FromFilename(inputFilePath));
AVAssetExportSession export = new AVAssetExportSession(asset, AVAssetExportSession.PresetHighestQuality);
export.OutputUrl = NSUrl.FromFilename(outputFilePath);
export.OutputFileType = AVFileType.Mpeg4;
export.ShouldOptimizeForNetworkUse = true;
try
{
    // Blocks until the export task finishes.
    export.ExportTaskAsync().Wait();
}
catch (Exception ex)
{
    System.Diagnostics.Debug.WriteLine(ex);
}
var fileHelper = new FileHelper();
// If the export succeeded, delete the original MOV file.
if (fileHelper.FileExists(_vm.Model.AnswerName))
{
    fileHelper.DeleteFile(Path.ChangeExtension(_vm.Model.AnswerName, "mov"));
}
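A few things worth trying (a sketch under assumptions, not a confirmed fix): AVAssetExportSession fails if a file already exists at OutputUrl, so delete any previous output first; pin TimeRange explicitly to the full asset duration; and check Status/Error after the export instead of blocking with .Wait(), which can also deadlock on the UI thread:

// Sketch: export the full duration explicitly and verify the result.
if (File.Exists(outputFilePath))
    File.Delete(outputFilePath); // the export fails if the target already exists

var export = new AVAssetExportSession(asset, AVAssetExportSession.PresetHighestQuality)
{
    OutputUrl = NSUrl.FromFilename(outputFilePath),
    OutputFileType = AVFileType.Mpeg4,
    ShouldOptimizeForNetworkUse = true,
    // The default time range should already cover the whole asset, but
    // setting it explicitly rules out a truncated range.
    TimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration }
};

await export.ExportTaskAsync();

if (export.Status != AVAssetExportSessionStatus.Completed)
    System.Diagnostics.Debug.WriteLine(export.Error?.LocalizedDescription);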
Related
We are using video-capture functionality in our Xamarin.Forms iOS application. After capturing the video, we compress it and upload it to the web server. The audio on the compressed videos is distorted. Here is the video compression code:
string fileName = Path.GetFileNameWithoutExtension(inputPath);
string outputPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Personal), fileName + ".mp4");
TaskCompletionSource<string> tcs = new TaskCompletionSource<string>();
var outputURL = new NSUrl(outputPath);
NSError error;
using (AVAsset asset = AVAsset.FromUrl(NSUrl.FromFilename(inputPath)))
using (AVAssetExportSession export = new AVAssetExportSession(asset, AVAssetExportSessionPreset.MediumQuality))
{
    export.OutputFileType = AVFileType.Mpeg4;
    export.ShouldOptimizeForNetworkUse = true;
    export.OutputUrl = NSUrl.FromFilename(outputPath);
    // Remove any previous output file before exporting.
    NSFileManager.DefaultManager.Remove(outputURL, out error);
    export.ExportTaskAsync(); // note: the task is not awaited here
    // 'tmp' is defined elsewhere in the original code; the snippet is truncated here.
    Stream exportStream = File.OpenRead(tmp);
How can we resolve this in Xamarin.Forms iOS?
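It is hard to pinpoint the distortion from the snippet alone, but one suspect (an assumption, not a confirmed diagnosis) is that ExportTaskAsync is never awaited, so File.OpenRead runs while AVFoundation is still writing the file, and a partially written MP4 can play back with corrupted audio. A minimal sketch of the safer ordering:

// Sketch: wait for the export to finish and verify it before reading the file.
await export.ExportTaskAsync();
if (export.Status == AVAssetExportSessionStatus.Completed)
{
    using (Stream exportStream = File.OpenRead(outputPath))
    {
        // upload the stream here
    }
}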
I have a small PNG image I'd like to show in an ImageView using Xamarin.Android.
I am downloading the file using the following code:
private void Download()
{
    var url = "https://hns.d7u.de/v4/images/hvvstoerungen_facebook.png";
    var directory = Environment.GetFolderPath(Environment.SpecialFolder.Personal) + "/myapp/";
    var fileName = url.Substring(url.LastIndexOf("/") + 1);
    var path = directory + fileName;
    System.Net.WebClient wC = new System.Net.WebClient();
    wC.Headers.Add(System.Net.HttpRequestHeader.AcceptEncoding, "gzip");
    wC.DownloadDataCompleted += WC_DownloadDataCompleted;
    wC.DownloadDataAsync(new Uri(url), path);
}
private void WC_DownloadDataCompleted(object sender, System.Net.DownloadDataCompletedEventArgs e)
{
    var path = e.UserState.ToString();
    var bytes = e.Result;
    if (File.Exists(path))
        File.Delete(path);
    if (!File.Exists(path))
        File.WriteAllBytes(path, bytes);
}
It is stored at /data/user/0/myapp/files/hns/hvvstoerungen_facebook.png, and File.Exists(...) returns true for that path, so I am sure the file is downloaded and exists.
When I want to show it in the ImageView, I do it like this:
if (System.IO.File.Exists(imageFilePath))
{
    Android.Net.Uri andrUri = Android.Net.Uri.Parse(imageFilePath);
    ImageIcon.SetImageURI(andrUri);
    //Also not working:
    //Bitmap bitmap = BitmapFactory.DecodeFile(imageFilePath);
    //ImageIcon.SetImageBitmap(bitmap);
    //And also not working:
    //Android.Net.Uri andrUri = Android.Net.Uri.Parse(imageFilePath);
    //Bitmap bmp = BitmapFactory.DecodeStream(Android.App.Application.Context.ContentResolver.OpenInputStream(andrUri));
    //ImageIcon.SetImageBitmap(bmp);
}
The Output window shows the following when the image should be shown:
02-01 23:41:24.770 E/Drawable(19815): Unable to decode stream: android.graphics.ImageDecoder$DecodeException: Failed to create image decoder with message 'unimplemented'Input contained an error.
02-01 23:41:24.770 W/ImageView(19815): resolveUri failed on bad bitmap uri: /data/user/0/myapp/files/hns/hvvstoerungen_facebook.png
But I cannot figure out what exactly this means.
One additional thing: if I run the app in a brand-new Android emulator instance, this image and all others of its kind are not shown.
If I run the app in an old emulator instance, where the app had already been running before as a Xamarin.Forms app, the old images known to the old project are shown, while the newly downloaded images are not. All images are in the same folder, and I cannot see any difference between them.
Does anyone have an idea?
Edit:
My working version has the following Download() method instead:
private void Download()
{
    // File types that are already compressed; requesting gzip for these
    // yields bytes the image decoder cannot read.
    var noCompression = new string[] { ".png", ".jpg", ".jpeg", ".gif", ".zip", ".7z", ".mp3", ".mp4" };
    var url = "https://hns.d7u.de/v4/images/hvvstoerungen_facebook.png";
    var directory = Environment.GetFolderPath(Environment.SpecialFolder.Personal) + "/myapp/";
    var fileName = url.Substring(url.LastIndexOf("/") + 1);
    var path = directory + fileName;
    System.Net.WebClient wC = new System.Net.WebClient();
    if (!noCompression.Contains(url.Substring(url.LastIndexOf('.'))))
        wC.Headers.Add(System.Net.HttpRequestHeader.AcceptEncoding, "gzip");
    wC.DownloadDataCompleted += WC_DownloadDataCompleted;
    wC.DownloadDataAsync(new Uri(url), path);
}
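The likely explanation: WebClient does not decompress responses automatically, so when the Accept-Encoding: gzip header was added by hand, the bytes written to disk were a gzip stream rather than a PNG. That matches the "Failed to create image decoder" error above. Skipping the header for already-compressed formats means the raw PNG bytes get stored.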
You could try the code below.
Download the image from the URL:
public Bitmap GetImageBitmapFromUrl(string url)
{
    Bitmap imageBitmap = null;
    using (var webClient = new WebClient())
    {
        var imageBytes = webClient.DownloadData(url);
        if (imageBytes != null && imageBytes.Length > 0)
        {
            imageBitmap = BitmapFactory.DecodeByteArray(imageBytes, 0, imageBytes.Length);
        }
    }
    return imageBitmap;
}
Usage:
bitmap = GetImageBitmapFromUrl("https://hns.d7u.de/v4/images/hvvstoerungen_facebook.png");
And save the image as PNG:
void ExportBitmapAsPNG(Bitmap bitmap)
{
    var folderPath = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
    // filePath is a field so the path can be reused below.
    filePath = System.IO.Path.Combine(folderPath, "test.png");
    using (var stream = new FileStream(filePath, FileMode.Create))
    {
        bitmap.Compress(Bitmap.CompressFormat.Png, 100, stream);
    }
}
Usage:
ExportBitmapAsPNG(bitmap);
Check whether the file exists, and set it into the ImageView:
if (File.Exists(filePath))
{
    Bitmap myBitmap = BitmapFactory.DecodeFile(filePath);
    imageview.SetImageBitmap(myBitmap);
}
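Note that this also sidesteps the gzip issue from the question: DownloadData here sends no Accept-Encoding header, so the returned bytes are a plain PNG that BitmapFactory.DecodeByteArray can handle.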
I created a VLC playlist (.xspf) file, and now I want to stream it with LibVLCSharp. The code is as follows; it works fine with a video file, but the .xspf file does not respond.
Code:
LibVLCSharp.Shared.LibVLC _libVLC;
MediaPlayer _mp;
_libVLC = new LibVLCSharp.Shared.LibVLC("-I", "null");
_mp = new MediaPlayer(_libVLC);
string xspf_file = @"D:\sample.xspf";
var media1 = new Media(_libVLC, xspf_file, FromType.FromPath);
media1.AddOption(":sout=#transcode{acodec=mp4a,ab=128,channels=2,samplerate=44100,scodec=none}:udp{dst=224.2.2.26:2226,mux=ts}");
_mp.Play(media1);
MessageBox.Show("play success");
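An .xspf file is a playlist rather than a single media item, so LibVLC first has to parse it, and you then play one of its sub-items: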
Core.Initialize();
using (var libVLC = new LibVLC())
{
    var media = new Media(libVLC, "playlist.xspf");
    // Parse the playlist so its entries become available as SubItems.
    await media.Parse(MediaParseOptions.ParseNetwork);
    using (var mp = new MediaPlayer(media.SubItems.First()))
    {
        media.Dispose();
        mp.Play();
        Console.ReadKey();
    }
}
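If the :sout streaming option from the question is still needed, it can presumably be applied to the parsed sub-item before playing (an untested assumption; AddOption is the same call used in the question):

media.SubItems.First().AddOption(":sout=#transcode{acodec=mp4a,ab=128,channels=2,samplerate=44100,scodec=none}:udp{dst=224.2.2.26:2226,mux=ts}");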
I am trying to do some basic video compositions in Xamarin/MonoTouch and am having some success, but am stuck on what seems to be a rather simple task.
I record videos from the camera in portrait, so I use AVAssetExportSession to rotate them. I have created layer instructions to rotate the video, and this works fine: I am able to successfully export the video in the correct orientation.
The Issue:
When I add the audio track into the export, I always get a failed response with this error:
Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1912c320 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
If I don't set the videoComposition property on the exportSession, the audio and video export perfectly fine, just with the wrong orientation. If anyone could give me some advice, it would be greatly appreciated. Below is my code:
var composition = new AVMutableComposition();
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var videoCompositionInstructions = new AVVideoCompositionInstruction[files.Count];
var index = 0;
var renderSize = new SizeF(480, 480);
var _startTime = CMTime.Zero;
//AVUrlAsset asset;
// The block below presumably runs once per file in 'files'
// (the surrounding loop is not shown in the snippet).
var asset = new AVUrlAsset(new NSUrl(file, false), new AVUrlAssetOptions());
//var asset = AVAsset.FromUrl(new NSUrl(file, false));
//create an avassetrack with our asset
var videoTrack = asset.TracksWithMediaType(AVMediaType.Video)[0];
var audioTrack = asset.TracksWithMediaType(AVMediaType.Audio)[0];
//create a video composition and preset some settings
NSError error;
var assetTimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = asset.Duration };
compositionTrackAudio.InsertTimeRange(new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = asset.Duration,
}, audioTrack, _startTime, out error);
if (error != null)
{
    Debug.WriteLine(error.Description);
}
compositionTrackVideo.InsertTimeRange(assetTimeRange, videoTrack, _startTime, out error);
//create a video instruction
var transformer = new AVMutableVideoCompositionLayerInstruction
{
    TrackID = videoTrack.TrackID,
};
var audioMix = new AVMutableAudioMix();
var mixParameters = new AVMutableAudioMixInputParameters
{
    TrackID = audioTrack.TrackID
};
mixParameters.SetVolumeRamp(1.0f, 1.0f, new CMTimeRange
{
    Start = CMTime.Zero,
    Duration = asset.Duration
});
audioMix.InputParameters = new[] { mixParameters };
var t1 = CGAffineTransform.MakeTranslation(videoTrack.NaturalSize.Height, 0);
//Make sure the square is portrait
var t2 = CGAffineTransform.Rotate(t1, (float)(Math.PI / 2f));
var finalTransform = t2;
transformer.SetTransform(finalTransform, CMTime.Zero);
//add the transformer layer instructions, then add to video composition
var instruction = new AVMutableVideoCompositionInstruction
{
    TimeRange = assetTimeRange,
    LayerInstructions = new[] { transformer }
};
videoCompositionInstructions[index] = instruction;
index++;
_startTime = CMTime.Add(_startTime, asset.Duration);
var videoComposition = new AVMutableVideoComposition();
videoComposition.FrameDuration = new CMTime(1 , (int)videoTrack.NominalFrameRate);
videoComposition.RenderScale = 1;
videoComposition.Instructions = videoCompositionInstructions;
videoComposition.RenderSize = renderSize;
var exportSession = new AVAssetExportSession(composition, AVAssetExportSession.PresetHighestQuality);
var filePath = _fileSystemManager.TempDirectory + DateTime.UtcNow.Ticks + ".mp4";
var outputLocation = new NSUrl(filePath, false);
exportSession.OutputUrl = outputLocation;
exportSession.OutputFileType = AVFileType.Mpeg4;
exportSession.VideoComposition = videoComposition;
exportSession.AudioMix = audioMix;
exportSession.ShouldOptimizeForNetworkUse = true;
exportSession.ExportAsynchronously(() =>
{
    Debug.WriteLine(exportSession.Status);
    switch (exportSession.Status)
    {
        case AVAssetExportSessionStatus.Failed:
            Debug.WriteLine(exportSession.Error.Description);
            Debug.WriteLine(exportSession.Error.DebugDescription);
            break;
        case AVAssetExportSessionStatus.Completed:
            if (File.Exists(filePath))
            {
                _uploadService.AddVideoToVideoByteList(File.ReadAllBytes(filePath), ".mp4");
                Task.Run(async () =>
                {
                    await _uploadService.UploadVideo(_videoData);
                });
            }
            break;
        case AVAssetExportSessionStatus.Unknown:
        case AVAssetExportSessionStatus.Exporting:
        case AVAssetExportSessionStatus.Cancelled:
            break;
    }
});
So this was a really stupid mistake: it was caused by adding the audio track before the video track, so the instructions must have been trying to apply the transform to the audio track rather than the video track.
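A sketch of that ordering fix, plus a guard that is probably safer (my assumption, not from the original answer): composition track IDs are assigned in the order the tracks are added, and the layer instruction's TrackID has to resolve against the composition, so pointing it at the composition's own video track avoids depending on the order at all:

// Add the video track first, or better, reference the composition track's ID.
var compositionTrackVideo = composition.AddMutableTrack(AVMediaType.Video, 0);
var compositionTrackAudio = composition.AddMutableTrack(AVMediaType.Audio, 0);
var transformer = new AVMutableVideoCompositionLayerInstruction
{
    // use the composition's video track, not the source asset's track
    TrackID = compositionTrackVideo.TrackID,
};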
My problem was that I forgot to set the timeRange; it should be like this:
let instruction = AVMutableVideoCompositionInstruction()
instruction.layerInstructions = [layer]
instruction.timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)
Note that AVMutableVideoCompositionInstruction.timeRange's end time must be valid. It is different from AVAssetExportSession.timeRange, which the documentation describes as:
The time range to be exported from the source.
The default time range of an export session is kCMTimeZero to kCMTimePositiveInfinity, meaning that (modulo a possible limit on file length) the full duration of the asset will be exported.
You can observe this property using Key-value observing.
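In the Xamarin bindings used above, the same property is spelled TimeRange; a sketch of the equivalent assignment (with videoDuration standing in for the composition's duration, as in the Swift snippet):

instruction.TimeRange = new CMTimeRange { Start = CMTime.Zero, Duration = videoDuration };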
I have an AIR app for iOS that I have been developing. I am trying to capture a picture, save the file to the storage directory (not the Camera Roll), and save the file name in an SQLite DB.
I have tried so many different variations of this, but when it comes to writing the file stream to save, the app hangs. I am testing on an iPad 3. Does anyone have a suggestion? This has been driving me nuts for days. I have searched the web, but I am stumped.
public var temp:File; // File object to save the name in the database

protected function selectPicture():void
{
    myCam = new CameraUI();
    myCam.addEventListener(MediaEvent.COMPLETE, onComplete);
    myCam.launch(MediaType.IMAGE);
}

protected function onComplete(event:MediaEvent):void {
    //imageProblem.source = event.data.file.url;
    var cameraUI:CameraUI = event.target as CameraUI;
    var mediaPromise:MediaPromise = event.data;
    var mpLoader:Loader = new Loader();
    mpLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    mpLoader.loadFilePromise(mediaPromise);
}

private function onMediaPromiseLoaded(e:Event):void
{
    var mpLoaderInfo:LoaderInfo = e.target as LoaderInfo;
    mpLoaderInfo.removeEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    this.imageProblem.source = mpLoaderInfo.loader;
    var stream:FileStream = new FileStream();
    stream.addEventListener(Event.COMPLETE, showComplete);
    stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
    try {
        this.messages.text = "Starting";
        stream.open(temp, FileMode.WRITE);
        stream.writeBytes(mpLoaderInfo.bytes);
        stream.close();
    } catch (e:Error) {
        this.messages.text = e.message;
    }
}

protected function showError(e:IOErrorEvent):void {
    this.messages.text = e.toString();
}

protected function showComplete(e:Event):void {
    this.messages.text = "Completed Writing";
    this.imgName.text = temp.url;
    imagefile = temp;
    deleteFlag = 1;
}
The application hangs because you are performing the file operation in sync mode. You need to use an async-mode operation instead:
stream.openAsync( temp, FileMode.WRITE );
Try this:
var stream:FileStream = new FileStream();
stream.addEventListener(Event.COMPLETE, showComplete);
stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
stream.openAsync( temp, FileMode.WRITE );
stream.writeBytes(mpLoaderInfo.bytes);
stream.close();
Note that when using the async operation you do not need try/catch. For error handling, listen for IOErrorEvent, which will fire if any error occurs.
I finally got this to work; I added comments in the code below to explain why it wasn't working.
public var temp:File;

protected function selectPicture():void
{
    myCam = new CameraUI();
    myCam.addEventListener(MediaEvent.COMPLETE, onComplete);
    myCam.launch(MediaType.IMAGE);
}

protected function onComplete(event:MediaEvent):void {
    //imageProblem.source = event.data.file.url;
    var cameraUI:CameraUI = event.target as CameraUI;
    var mediaPromise:MediaPromise = event.data;
    var mpLoader:Loader = new Loader();
    mpLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    mpLoader.loadFilePromise(mediaPromise);
}

private function onMediaPromiseLoaded(e:Event):void
{
    var mpLoaderInfo:LoaderInfo = e.target as LoaderInfo;
    mpLoaderInfo.removeEventListener(Event.COMPLETE, onMediaPromiseLoaded);
    this.imageProblem.source = mpLoaderInfo.loader;
    /// Here was the solution:
    /// draw the loaded content into a BitmapData first
    var bitmapDataA:BitmapData = new BitmapData(mpLoaderInfo.width, mpLoaderInfo.height);
    bitmapDataA.draw(mpLoaderInfo.content, null, null, null, null, true);
    // function to shrink the image
    var bitmapDataB:BitmapData = resizeimage(bitmapDataA, int(mpLoaderInfo.width / 4), int(mpLoaderInfo.height / 4));
    var c:CameraRoll = new CameraRoll();
    c.addBitmapData(bitmapDataB);
    var now:Date = new Date();
    var f:File = File.applicationStorageDirectory.resolvePath("IMG" + now.seconds + now.minutes + ".jpg");
    var stream:FileStream = new FileStream();
    stream.open(f, FileMode.WRITE);
    // then re-encode as a JPEG before writing the file
    var bytes:ByteArray = new ByteArray();
    bytes = bitmapDataB.encode(new Rectangle(0, 0, int(mpLoaderInfo.width / 4), int(mpLoaderInfo.height / 4)), new JPEGEncoderOptions(80), bytes);
    stream.writeBytes(bytes, 0, bytes.bytesAvailable);
    stream.close();
    this.imgName.text = f.url;
    imagefile = f;
    deleteFlag = 1;
}