UIImagePickerController not working anymore - iOS

In my Xamarin.iOS app, I have code to use the camera to take a photo:
var cameraPicker = new UIImagePickerController();
cameraPicker.SourceType = UIImagePickerControllerSourceType.Camera;
cameraPicker.MediaTypes = new string[] { UTType.Image, UTType.Movie };
cameraPicker.FinishedPickingMedia += Handle_FinishedPickingMedia;
cameraPicker.Canceled += Handle_Canceled;
NavigationController.PresentViewController(cameraPicker, true, null);
This was working before, but now nothing happens when I run this code.
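A few things are worth ruling out first (these are assumptions on my part, since the question doesn't say which iOS version or device this runs on): the camera source type may simply not be available where the code runs, the NavigationController used for presentation may be null, and on iOS 10 and later the app needs an NSCameraUsageDescription entry in Info.plist before it can use the camera. A minimal guard sketch:
// Sketch only: guard the presentation so a missing camera or a null
// NavigationController does not fail silently.
if (!UIImagePickerController.IsSourceTypeAvailable(UIImagePickerControllerSourceType.Camera))
{
    Console.WriteLine("Camera not available (simulator, or camera access restricted).");
    return;
}
var cameraPicker = new UIImagePickerController
{
    SourceType = UIImagePickerControllerSourceType.Camera,
    MediaTypes = new string[] { UTType.Image, UTType.Movie }
};
cameraPicker.FinishedPickingMedia += Handle_FinishedPickingMedia;
cameraPicker.Canceled += Handle_Canceled;
// If NavigationController is null here, nothing is ever presented.
UIViewController presenter = NavigationController;
if (presenter == null)
    presenter = this;
presenter.PresentViewController(cameraPicker, true, null);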

Related

didFinishPickingMediaWithInfo not returning UIImagePickerControllerOriginalImage

I'm using a UIImagePickerController to select a photo and later edit it using TOCropViewController:
if (UIImagePickerController.IsSourceTypeAvailable(UIImagePickerControllerSourceType.PhotoLibrary))
{
    var picker = new UIImagePickerController
    {
        SourceType = UIImagePickerControllerSourceType.PhotoLibrary,
        WeakDelegate = this
    };
    PresentViewController(picker, true, () =>
    {
    });
}
Most of the time it works correctly, but for some images didFinishPickingMediaWithInfo does not return UIImagePickerControllerOriginalImage.
[Export("imagePickerController:didFinishPickingMediaWithInfo:")]
private void FinishedPickingMedia(UIImagePickerController picker, NSDictionary dic)
{
    // Here img is sometimes null, because UIImagePickerControllerOriginalImage is not found
    var img = dic.ObjectForKey(new NSString("UIImagePickerControllerOriginalImage")) as UIImage;
    picker.DismissViewController(true, () =>
    {
        try
        {
            _cropViewController = new TOCropViewController(TOCropViewCroppingStyle.Default, img)
            {
                AspectRatioLockEnabled = true,
                AspectRatioPickerButtonHidden = true,
                AspectRatioPreset = TOCropViewControllerAspectRatioPreset.Square,
                Delegate = _croppingDelegate,
                ResetAspectRatioEnabled = false
            };
            PresentViewController(_cropViewController, true, null);
        }
        catch (Exception ex)
        {
            Crashes.TrackError(ex);
        }
    });
}
For example, when things are OK I get a result such as:
"UIImagePickerControllerMediaType" = "public.image";
"UIImagePickerControllerOriginalImage" = <UIImage:0x280a3c360 anonymous {820, 512}>;
"UIImagePickerControllerReferenceURL" = assets-library://asset/asset.JPG?id=4EDD6F79-865E-48C9-AB91-99E58E5F323B&ext=JPG;
"UIImagePickerControllerImageURL" = file:///private/var/mobile/Containers/Data/Application/8FB0FD8E-98A4-4C28-9687-E33BCE53D460/tmp/2107BB84-ECF3-47E0-9538-B380B29E7BE9.jpeg;
When I'm not getting UIImagePickerControllerOriginalImage, the result is:
"UIImagePickerControllerMediaType" = "public.image";
"UIImagePickerControllerReferenceURL" = assets-library://asset/asset.JPG?id=567381A3-D0AC-4124-9CCD-C526E0074F83&ext=JPG;
And I don't get a UIImage to edit.
What can cause this behaviour and what can be done to resolve it?
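One fallback sometimes used in this situation is to resolve the picked asset through the Photos framework, since UIImagePickerControllerReferenceURL is still present in the failing dictionary. This is only a sketch, not a confirmed fix: it assumes a using Photos; directive and photo-library permission, and NetworkAccessAllowed is set because an original that only exists in iCloud is one possible reason the image is missing.
var img = dic.ObjectForKey(new NSString("UIImagePickerControllerOriginalImage")) as UIImage;
if (img == null)
{
    // Sketch: fall back to the asset reference URL when the original image is absent.
    var referenceUrl = dic.ObjectForKey(new NSString("UIImagePickerControllerReferenceURL")) as NSUrl;
    if (referenceUrl != null)
    {
        // FetchAssets with asset URLs is deprecated on iOS 11+, but it still resolves
        // the assets-library:// URLs that UIImagePickerController returns.
        var asset = PHAsset.FetchAssets(new[] { referenceUrl }, null).firstObject as PHAsset;
        if (asset != null)
        {
            var options = new PHImageRequestOptions { Synchronous = true, NetworkAccessAllowed = true };
            PHImageManager.DefaultManager.RequestImageForAsset(
                asset,
                PHImageManager.MaximumSize,
                PHImageContentMode.Default,
                options,
                (result, info) => img = result);
        }
    }
}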

Memory Leak with Phonegap Cordova Web Audio

I've seen this question asked here already: Web Audio API Memory Leaks on Mobile Platforms, but there doesn't seem to be any response to it. I've tried lots of different variations - setting variables to null once I'm finished with them, or declaring the variables within scope - but it appears that the AudioContext (or OfflineAudioContext) is not being garbage collected after each operation. I'm using this with PhoneGap on iOS, so the browser is Safari. Any ideas how to solve this memory leak?
Here is my code:
function readVocalsToBuffer(file){
console.log('readVocalsToBuffer');
var reader = new FileReader();
reader.onloadend = function(evt){
var x = audioContext.decodeAudioData(evt.target._result, function(buffer){
if(!buffer){
console.log('error decoding file to Audio Buffer');
return;
}
window.voiceBuffer = buffer;
loadBuffers();
});
}
reader.readAsArrayBuffer(file);
}
function loadBuffers(){
console.log('loadBuffers');
try{
var bufferLoader = new BufferLoader(
audioContext,
[
"."+window.srcSong
],
createOffLineContext
);
bufferLoader.load()
}
catch(e){
console.log(e.message);
}
}
function createOffLineContext(bufferList){
console.log('createOfflineContext');
offline = new webkitOfflineAudioContext(2, window.voiceBuffer.length, 44100);
var vocalSource = offline.createBufferSource();
vocalSource.buffer = window.voiceBuffer;
vocalSource.connect(offline.destination);
var backing = offline.createBufferSource();
backing.buffer = bufferList[0];
backing.connect(offline.destination);
vocalSource.start(0);
backing.start(0);
offline.oncomplete = function(ev){
vocalSource.stop(0);
backing.stop(0);
vocalSource.disconnect(0);
backing.disconnect(0);
delete vocalSource;
delete backing;
delete window.voiceBuffer;
window.renderedFile = ev.renderedBuffer;
var bufferR = ev.renderedBuffer.getChannelData(0);
var bufferL = ev.renderedBuffer.getChannelData(1);
var interleaved = interleave(bufferL, bufferR);
var dataview = encodeWAV(interleaved);
window.audioBlob = new Blob([dataview], {type: 'Wav'});
saveFile();
}
offline.startRendering();
}
function interleave(inputL, inputR){
console.log('interleave');
var length = inputL.length + inputR.length;
var result = new Float32Array(length);
var index = 0,
inputIndex = 0;
while (index < length){
result[index++] = inputL[inputIndex];
result[index++] = inputR[inputIndex];
inputIndex++;
}
return result;
}
function saveFile(){
offline = null;
console.log('saveFile');
delete window.renderedFile;
window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, onFSSuccess, fail);
}

UIImagePickerController is showing last clicked image. I am using Monotouch/Xamarin Studio

In my project, when I launch the camera the first time it works fine. When I launch the camera a second time, I see the last captured image in the viewfinder. I am not sure what's causing this.
Can anyone please help me here?
The following is the code block that launches the camera:
UIImagePickerController imagePicker = new UIImagePickerController();
// Handle media selected.
var documentsDirectory = Environment.GetFolderPath (Environment.SpecialFolder.MyDocuments);
imagePicker.FinishedPickingMedia += (sender, e) => {
UIImage image = (UIImage)e.Info.ObjectForKey(
new NSString("UIImagePickerControllerOriginalImage"));
if (image != null)
{
this.InvokeOnMainThread(() => {
this.clickedImage.Image = image;
image.SaveToPhotosAlbum(delegate(UIImage img, NSError err){
});
string pngFilename = System.IO.Path.Combine (documentsDirectory, "Photo.png"); // hardcoded filename, overwrites each time
NSData imgData = image.AsPNG();
NSError SaveErr = null;
if (imgData.Save(pngFilename, false, out SaveErr))
{
Console.WriteLine("saved as " + pngFilename);
} else {
Console.WriteLine("NOT saved as" + pngFilename + " because" + SaveErr.LocalizedDescription);
}
});
}
DismissViewController(true,null);
};
// Handle cancellation of picker.
imagePicker.Canceled += (sender, e) => {
DismissViewController(true,null);
};
btnCameraDisplay1.SetTitle("Take Picture", UIControlState.Normal);
btnCameraDisplay1.Font = UIFont.SystemFontOfSize(19);
btnCameraDisplay1.SetTitleColor(UIColor.Black, UIControlState.Normal);
btnCameraDisplay1.TouchUpInside += delegate(object sender, EventArgs e)
{
if(UIImagePickerController.IsSourceTypeAvailable(UIImagePickerControllerSourceType.Camera))
{
imagePicker.SourceType = UIImagePickerControllerSourceType.Camera;
imagePicker.AllowsEditing = false;
this.PresentViewController(imagePicker, true,null);
}
else{
alertView = new UIAlertView();
alertView.AddButton("OK");
alertView.Message = "No camera available in this device.";
alertView.AlertViewStyle = UIAlertViewStyle.Default;
alertView.Show();
}
};
I ran into the same problem. Some searching here on Stack Overflow helped:
It might be a problem with Xamarin's implementation of the FinishedPickingMedia event (from https://stackoverflow.com/a/20503817/383658).
Solution:
Switch from the Event System to Delegates as explained here:
https://stackoverflow.com/a/20035698/383658
So basically move your code from:
imagePicker.FinishedPickingMedia += (sender, e) => {}
to the corresponding method of your new UIImagePickerControllerDelegate:
public override void FinishedPickingMedia (UIImagePickerController picker, NSDictionary info)
{}
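A minimal sketch of that delegate-based approach (the class name and wiring below are illustrative, not taken from the linked answer):
// Illustrative delegate subclass; the logic from the FinishedPickingMedia and
// Canceled event handlers above moves into these overrides.
public class CameraPickerDelegate : UIImagePickerControllerDelegate
{
    readonly UIViewController parent;

    public CameraPickerDelegate(UIViewController parent)
    {
        this.parent = parent;
    }

    public override void FinishedPickingMedia(UIImagePickerController picker, NSDictionary info)
    {
        var image = info.ObjectForKey(new NSString("UIImagePickerControllerOriginalImage")) as UIImage;
        // ... display or save the image exactly as in the event handler above ...
        parent.DismissViewController(true, null);
    }

    public override void Canceled(UIImagePickerController picker)
    {
        parent.DismissViewController(true, null);
    }
}

// Wired up instead of subscribing to FinishedPickingMedia / Canceled:
imagePicker.Delegate = new CameraPickerDelegate(this);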

iOS saving images in flashbuilder air mobile app with CameraUI

I have an AIR app for iOS that I have been developing. I am trying to capture a picture, save the file to the application storage directory (not the Camera Roll), and save the file name in an SQLite db.
I have tried so many different variations of this, but when it comes to writing the file stream to save it, the app hangs. Testing on an iPad 3. Does ANYONE have a suggestion? This has been driving me nuts for days. I have searched the web but I am stumped.
public var temp:File; // File Object to save name in database
protected function selectPicture():void
{
myCam = new CameraUI();
myCam.addEventListener(MediaEvent.COMPLETE, onComplete);
myCam.launch(MediaType.IMAGE);
}
protected function onComplete(event:MediaEvent):void {
//imageProblem.source = event.data.file.url;
var cameraUI:CameraUI = event.target as CameraUI;
var mediaPromise:MediaPromise = event.data;
var mpLoader:Loader = new Loader();
mpLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaPromiseLoaded);
mpLoader.loadFilePromise(mediaPromise);
}
private function onMediaPromiseLoaded(e:Event):void
{
var mpLoaderInfo:LoaderInfo = e.target as LoaderInfo;
mpLoaderInfo.removeEventListener(Event.COMPLETE, onMediaPromiseLoaded);
this.imageProblem.source = mpLoaderInfo.loader;
var stream:FileStream = new FileStream();
stream.addEventListener(Event.COMPLETE, showComplete);
stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
try{
this.messages.text = "Starting";
stream.open( temp, FileMode.WRITE );
stream.writeBytes(mpLoaderInfo.bytes);
stream.close();
}catch(e:Error){
this.messages.text = e.message;
}
}
protected function showError(e:IOErrorEvent):void{
this.messages.text = e.toString();
}
protected function showComplete(e:Event):void{
this.messages.text = "Completed Writing";
this.imgName.text = temp.url;
imagefile = temp;
deleteFlag = 1;
}
The application hangs because you are trying to use the file operation in synchronous mode.
You need to use the asynchronous operation instead of the synchronous one:
stream.openAsync( temp, FileMode.WRITE );
Try it like this:
var stream:FileStream = new FileStream();
stream.addEventListener(Event.COMPLETE, showComplete);
stream.addEventListener(IOErrorEvent.IO_ERROR, showError);
stream.openAsync( temp, FileMode.WRITE );
stream.writeBytes(mpLoaderInfo.bytes);
stream.close();
Note that when using the async operation you don't need a try/catch. To handle errors, listen for IOErrorEvent, which will catch any error that occurs.
I finally got this to work. I added comments in the code below to explain why it wasn't working.
public var temp:File;
protected function selectPicture():void
{
myCam = new CameraUI();
myCam.addEventListener(MediaEvent.COMPLETE, onComplete);
myCam.launch(MediaType.IMAGE);
}
protected function onComplete(event:MediaEvent):void {
//imageProblem.source = event.data.file.url;
var cameraUI:CameraUI = event.target as CameraUI;
var mediaPromise:MediaPromise = event.data;
var mpLoader:Loader = new Loader();
mpLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onMediaPromiseLoaded);
mpLoader.loadFilePromise(mediaPromise);
}
private function onMediaPromiseLoaded(e:Event):void
{
var mpLoaderInfo:LoaderInfo = e.target as LoaderInfo;
mpLoaderInfo.removeEventListener(Event.COMPLETE, onMediaPromiseLoaded);
this.imageProblem.source = mpLoaderInfo.loader;
/// Here was the solution
var bitmapDataA:BitmapData = new BitmapData(mpLoaderInfo.width, mpLoaderInfo.height);
bitmapDataA.draw(mpLoaderInfo.content,null,null,null,null,true);
/// I had to cast the loaderInfo as BitmapData
var bitmapDataB:BitmapData = resizeimage(bitmapDataA, int(mpLoaderInfo.width / 4), int(mpLoaderInfo.height/ 4)); // function to shrink the image
var c:CameraRoll = new CameraRoll();
c.addBitmapData(bitmapDataB);
var now:Date = new Date();
var f:File = File.applicationStorageDirectory.resolvePath("IMG" + now.seconds + now.minutes + ".jpg");
var stream:FileStream = new FileStream()
stream.open(f, FileMode.WRITE);
// Then had to redraw and encode as a jpeg before writing the file
var bytes:ByteArray = new ByteArray();
bytes = bitmapDataB.encode(new Rectangle(0,0, int(mpLoaderInfo.width / 4) , int(mpLoaderInfo.height / 4)), new JPEGEncoderOptions(80), bytes);
stream.writeBytes(bytes,0,bytes.bytesAvailable);
stream.close();
this.imgName.text = f.url;
imagefile = f;
deleteFlag = 1;
}

Integrate Admob in iOS App which uses monogame

I have been trying to do this for a few days and still cannot make it work. I tried to find some samples, but they are all for Android. Did anyone succeed in integrating AdMob on iOS?
I had some problems with the AdMob bindings from the monotouch-bindings repository, but then I switched to the AlexTouch.GoogleAdMobAds bindings and they work just great. You can find a sample of using AlexTouch.GoogleAdMobAds in the README on GitHub. It is quite simple, but if you need some help, feel free to ask a more detailed question.
// code for admob "using AlexTouch.GoogleAdMobAds"
UIViewController vc = new UIViewController ();
UIViewController controller = UIApplication.SharedApplication.Windows [0].RootViewController;
var ad = new GADBannerView (GADAdSizeCons.SmartBannerLandscape, new PointF (0, 0))
{
AdUnitID = "ADMOB_ID",
RootViewController = vc
};
ad.Hidden = false;
ad.DidReceiveAd += delegate {
ad.Hidden = false;
ad.Frame = new System.Drawing.RectangleF (0, (int) 0, (int) (ad.Bounds.Width), (int) (ad.Bounds.Height));
Console.WriteLine ("AD Received");
};
ad.DidFailToReceiveAdWithError += delegate(object sender, GADBannerViewDidFailWithErrorEventArgs e) {
ad.Hidden = true;
Console.WriteLine (e.Error);
};
ad.WillPresentScreen += delegate {
Console.WriteLine ("showing new screen");
};
ad.WillLeaveApplication += delegate {
Console.WriteLine ("I will leave application");
};
ad.WillDismissScreen += delegate {
Console.WriteLine ("Dismissing opened screen");
};
ad.UserInteractionEnabled = true;
vc.View.AddSubview(ad);
vc.View.Frame = new System.Drawing.RectangleF(0f, 0f, (int)(ad.Bounds.Width), (int)(ad.Bounds.Height));
controller.View.AddSubview(vc.View);
Task.Factory.StartNew(() => {
while (true)
{
Console.WriteLine("Requesting Ad");
InvokeOnMainThread (delegate {
GADRequest r = new GADRequest();
ad.LoadRequest(r);
});
System.Threading.Thread.Sleep(30000);
}
});
