Artifacts in UIImage when loaded in background - ios

I'm trying to load images for GMGridView cells. The issue is that the image loading process is not fast enough, so I decided to use multithreading. I created an all-in-one class for background image loading. Here are its contents:
public void LoadImageIntoView (string imageURL, UIImageView imageView, int index)
{
    rwl.AcquireReaderLock (Timeout.Infinite);
    if (disposed)
        return;
    UIImage image;
    lock (locker) {
        cache.TryGetValue (imageURL, out image);
    }
    if (image != null)
        imageView.Image = image;
    else {
        new Thread (() => {
            if (MediaLoader.IsFileCached (imageURL))
                LoadImage (index, imageURL);
            else {
                MediaLoader loader = new MediaLoader ();
                loader.OnCompleteDownload += (object sender, OnCompleteDownloadEventArgs e) => {
                    if (e.Success)
                        LoadImage (index, e.FileURL);
                };
                loader.GetFileAsync (imageURL, false, DownloadPriority.Low);
            }
        }).Start ();
    }
    rwl.ReleaseReaderLock ();
}

private void LoadImage (int index, string imageURL)
{
    rwl.AcquireReaderLock (Timeout.Infinite);
    if (disposed)
        return;
    string pathToFile = MediaLoader.GetCachedFilePath (imageURL);
    // Load the image
    UIImage uiImage = UIImage.FromFile (pathToFile);
    if (uiImage != null) {
        lock (locker) {
            cache [imageURL] = uiImage;
        }
        BeginInvokeOnMainThread (() => InsertImage (false, index, uiImage));
    }
    rwl.ReleaseReaderLock ();
}

private void InsertImage (bool secondTime, int index, UIImage image)
{
    rwl.AcquireReaderLock (Timeout.Infinite);
    if (disposed)
        return;
    UIImageView imageView = FireGetImageViewCallback (index);
    if (imageView != null) {
        CATransition transition = CATransition.CreateAnimation ();
        transition.Duration = 0.3f;
        transition.TimingFunction = CAMediaTimingFunction.FromName (CAMediaTimingFunction.EaseInEaseOut);
        transition.Type = CATransition.TransitionFade;
        imageView.Layer.AddAnimation (transition, null);
        imageView.Image = image;
    } else {
        if (!secondTime) {
            new Thread (() => {
                Thread.Sleep (150);
                BeginInvokeOnMainThread (() => InsertImage (true, index, image));
            }).Start ();
        }
    }
    rwl.ReleaseReaderLock ();
}
I have also tried this code for image loading inside the LoadImage method:
UIImage loadedImage = UIImage.FromFile (pathToFile);
CGImage image = loadedImage.CGImage;
if (image != null) {
    CGColorSpace colorSpace = CGColorSpace.CreateDeviceRGB ();
    // Create a bitmap context from the image's specifications
    CGBitmapContext bitmapContext = new CGBitmapContext (null, image.Width, image.Height, image.BitsPerComponent, image.Width * 4, colorSpace, CGImageAlphaInfo.PremultipliedFirst);
    bitmapContext.ClearRect (new System.Drawing.RectangleF (0, 0, image.Width, image.Height));
    // Draw the image into the bitmap context and retrieve the decompressed image
    bitmapContext.DrawImage (new System.Drawing.RectangleF (0, 0, image.Width, image.Height), image);
    CGImage decompressedImage = bitmapContext.ToImage ();
    // Create a UIImage
    uiImage = new UIImage (decompressedImage);
    // Release everything
    colorSpace.Dispose ();
    decompressedImage.Dispose ();
    bitmapContext.Dispose ();
    image.Dispose ();
}
When I build and run my app, images returned by my ImageLoader occasionally contain artifacts: sometimes white rectangles at random locations, sometimes unexpectedly colored pixels. I'd be very happy to hear a solution to this problem, as the app is about to go to the App Store and this issue is a big headache.
P.S. FireGetImageViewCallback returns a UIImageView via a delegate that I set in the class's constructor. cache is a Dictionary, locker is just an object, and rwl is a ReaderWriterLock instance.
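For reference, a rough sketch of what those fields might look like; these declarations are inferred from the description and the code above, not taken from the original post:
// Hypothetical field declarations (names and types inferred from the surrounding code)
readonly Dictionary<string, UIImage> cache = new Dictionary<string, UIImage> ();
readonly object locker = new object ();
readonly ReaderWriterLock rwl = new ReaderWriterLock ();
Func<int, UIImageView> getImageViewCallback; // set in the constructor; used by FireGetImageViewCallback
bool disposed;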

The problem was solved by using GCD instead of regular C# threading. It puts the tasks into a queue, which makes them run one after another rather than simultaneously. This worked perfectly, except that when you scroll down a huge list of images and all of them go into the queue, it takes a long time for the currently visible rows to be filled with images. That's why I applied an optimization: when my ImageLoader's LoadImageIntoView method is called, it is also given an index, so ImageLoader knows which row was acquired last. In the task I check whether the cell whose image is about to be loaded is currently visible; if not, the task simply returns, allowing the next task to execute. Here's some code that illustrates this approach:
private void LoadImage (int index, string imageURL)
{
    DispatchQueue.GetGlobalQueue (DispatchQueuePriority.Low).DispatchAsync (() => {
        rwl.AcquireReaderLock (Timeout.Infinite);
        if (disposed)
            return;
        bool shouldDownload = false;
        lastAcquiredIndexRwl.AcquireReaderLock (Timeout.Infinite);
        shouldDownload = index <= (lastAcquiredIndex + visibleRange) && index >= (lastAcquiredIndex - visibleRange);
        lastAcquiredIndexRwl.ReleaseReaderLock ();
        if (shouldDownload) {
            string pathToFile = MediaLoader.GetCachedFilePath (imageURL);
            UIImage uiImage = null;
            // Load the image
            CGDataProvider dataProvider = new CGDataProvider (pathToFile);
            CGImage image = null;
            if (pathToFile.IndexOf (".png") != -1)
                image = CGImage.FromPNG (dataProvider, null, false, CGColorRenderingIntent.Default);
            else
                image = CGImage.FromJPEG (dataProvider, null, false, CGColorRenderingIntent.Default);
            if (image != null) {
                CGColorSpace colorSpace = CGColorSpace.CreateDeviceRGB ();
                // Create a bitmap context from the image's specifications
                CGBitmapContext bitmapContext = new CGBitmapContext (null, image.Width, image.Height, image.BitsPerComponent, image.Width * 4, colorSpace, CGImageAlphaInfo.PremultipliedFirst | (CGImageAlphaInfo)CGBitmapFlags.ByteOrder32Little);
                colorSpace.Dispose ();
                bitmapContext.ClearRect (new System.Drawing.RectangleF (0, 0, image.Width, image.Height));
                // Draw the image into the bitmap context and retrieve the decompressed image
                bitmapContext.DrawImage (new System.Drawing.RectangleF (0, 0, image.Width, image.Height), image);
                image.Dispose ();
                CGImage decompressedImage = bitmapContext.ToImage ();
                bitmapContext.Dispose ();
                uiImage = new UIImage (decompressedImage);
                decompressedImage.Dispose ();
            }
            if (uiImage != null) {
                lock (locker) {
                    cache [imageURL] = uiImage;
                }
                DispatchQueue.MainQueue.DispatchAsync (() => InsertImage (false, index, uiImage));
            }
        }
        rwl.ReleaseReaderLock ();
    });
}
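The post doesn't show how lastAcquiredIndex is kept up to date. A minimal sketch, assuming the lastAcquiredIndexRwl and lastAcquiredIndex fields used above, might look like this:
// Sketch: record the most recently requested row before dispatching the load,
// so the queued task above can skip rows that have scrolled out of the visible range.
public void LoadImageIntoView (string imageURL, UIImageView imageView, int index)
{
    lastAcquiredIndexRwl.AcquireWriterLock (Timeout.Infinite);
    lastAcquiredIndex = index;
    lastAcquiredIndexRwl.ReleaseWriterLock ();
    // ... cache lookup and LoadImage dispatch as shown earlier ...
}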

Related

NativeScript BitmapFactory resize does not work in iOS

I am using the BitmapFactory plugin to resize pictures and save them as thumbnails:
let mutable = BitmapFactory.makeMutable(imageSource);
let rotatedBitmap = BitmapFactory.asBitmap(mutable).dispose((bmp) =>
{
    return bmp.resize("80,80");
});
In Android it works perfectly, but in iOS the returned image is not resized. I read in some issues that this is because iOS uses the default scaling.
Does anybody know how to resize images to square thumbnails with a given size? I use the function below:
imageSource = imageSourceModule.fromFile(filepath);
try
{
    var mutable = BitmapFactory.makeMutable(imageSource);
    var ThumbBitmap = BitmapFactory.asBitmap(mutable).dispose((bmp) =>
    {
        var optisizestring = "25,25";
        test = bmp.resize(optisizestring);
        base64JPEG = test.toBase64(BitmapFactory.OutputFormat.JPEG, 75);
        img = imageSource.fromBase64(base64JPEG);
        resolve("data:image/png;base64," + base64JPEG);
        global.gc(); // tried with \ without garbage collection here
    });
} catch (ex) { console.log("errds " + ex); resolve(null); }
In the BitmapFactory plugin (BitmapFactory.ios.js) I changed the resize function:
iOSImage.prototype._resize = function(newSize) {
    var oldImg = this._nativeObject;
    try {
        var ns = CGSizeMake(newSize.width, newSize.height);
        UIGraphicsBeginImageContext(ns);
        var context = UIGraphicsGetCurrentContext();
        oldImg.drawInRect(CGRectMake(0, 0, ns.width, ns.height));
        return UIGraphicsGetImageFromCurrentImageContext();
    }
    finally
    {
        UIGraphicsEndImageContext();
    }
};

// [INTERNAL] _resize() Original
// iOSImage.prototype._resize = function(newSize) {
//     var oldImg = this._nativeObject;
//     try {
//         var ns = CGSizeMake(newSize.width, newSize.height);
//         UIGraphicsBeginImageContextWithOptions(ns, false, 0.0);
//         oldImg.drawInRect(CGRectMake(0, 0, ns.width, ns.height));
//         return new iOSImage(UIGraphicsGetImageFromCurrentImageContext());
//     }
//     finally {
//         UIGraphicsEndImageContext();
//     }
// };
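The scale argument is the likely culprit here: UIGraphicsBeginImageContextWithOptions(ns, false, 0.0) uses the device's screen scale, so an "80,80" request produces a bitmap with 2x or 3x that many pixels on retina devices. For comparison, and purely as a hedged sketch (in C#/Xamarin.iOS, the language used elsewhere on this page, rather than NativeScript), the same UIKit calls with the scale forced to 1.0 so the output has exactly the requested pixel dimensions:
// Sketch: resize a UIImage to an exact pixel size by fixing the context scale at 1.0.
UIImage ResizeToThumbnail (UIImage original, float width, float height)
{
    UIGraphics.BeginImageContextWithOptions (new System.Drawing.SizeF (width, height), false, 1.0f);
    try {
        original.Draw (new System.Drawing.RectangleF (0, 0, width, height));
        return UIGraphics.GetImageFromCurrentImageContext ();
    } finally {
        UIGraphics.EndImageContext ();
    }
}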

iOS Xamarin Attaching a picture to the Email Body - Null Exception

In my first Xamarin iOS app I take a picture, extract the picture's metadata, attach the picture to an email body, and send the email.
// (1.) Take a photo with the Camera //
partial void BtnCamera_TouchUpInside(UIButton sender)
{
    UIImagePickerController imagePicker = new UIImagePickerController();
    imagePicker.PrefersStatusBarHidden();
    imagePicker.SourceType = UIImagePickerControllerSourceType.Camera;
    // handle saving picture and extracting meta-data from picture //
    imagePicker.FinishedPickingMedia += Handle_FinishedPickingMedia;
    // present //
    PresentViewController(imagePicker, true, () => { });
}

// (2.) Saves the image to the phone and then extracts metadata //
protected void Handle_FinishedPickingMedia(object sender, UIImagePickerMediaPickedEventArgs e)
{
    //NSUrl url = null;
    try
    {
        #region Save Image and Get Meta-data
        // Save Image before processing for meta-data //
        SaveImagetoPhone(e);
        // Get meta-data from saved image //
        GetImageMetaData(e);

// (3.) Save picture to the phone and extract photo url //
private static void SaveImagetoPhone(UIImagePickerMediaPickedEventArgs e)
{
    NSUrl url = null;
    void ImageData(PHAsset asset)
    {
        if (asset == null)
            throw new Exception("PHAsset is null");
        PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
        {
            url = info.ValueForKey(new NSString("PHImageFileURLKey")) as NSUrl;
            PhotoDataClass._file = url.Path;
        });
    }
    PHAsset phAsset;
    if (e.ReferenceUrl == null)
    {
        e.OriginalImage?.SaveToPhotosAlbum((image, error) =>
        {
            if (error == null)
            {
                var options = new PHFetchOptions
                {
                    FetchLimit = 1,
                    SortDescriptors = new[] { new NSSortDescriptor("creationDate", true) }
                };
                phAsset = PHAsset.FetchAssets(options).LastOrDefault() as PHAsset;
                ImageData(phAsset);
            }
        });
    }
    else
    {
        phAsset = PHAsset.FetchAssets(new[] { e.ReferenceUrl }, null).FirstOrDefault() as PHAsset;
        ImageData(phAsset);
    }
}
At this point I have successfully extracted the URL for the file:
"/var/mobile/Media/DCIM/100APPLE/IMG_0036.JPG"
However, when I run my email function, even though the URL is populated, I get a null exception without any explanation that I know of.
// (4.) Email functionality //
// https://developer.xamarin.com/recipes/ios/shared_resources/email/send_an_email/
partial void BtnMessageDone_TouchUpInside(UIButton sender)
{
    MFMailComposeViewController mailController;
    if (MFMailComposeViewController.CanSendMail)
    {
        StringBuilder htmlBodyMail = FormatEmailBody();
        mailController = new MFMailComposeViewController();
        // do mail operations here
        mailController.SetToRecipients(new string[] { "xxx.yy#email.com" });
        mailController.SetSubject("mail test");
        mailController.SetMessageBody(htmlBodyMail.ToString(), false);
        UIImage img = UIImage.FromFile(PhotoDataClass._file);
        mailController.AddAttachmentData(img.AsJPEG(), "image/JPG", "Image.JPG");
        mailController.Finished += (object s, MFComposeResultEventArgs args) =>
        {
            Console.WriteLine(args.Result.ToString());
            args.Controller.DismissViewController(true, null);
        };
        this.PresentViewController(mailController, true, null);
    }
}
The null exception occurs on this line inside the email functionality:
mailController.AddAttachmentData(img.AsJPEG(), "image/JPG", "Image.JPG");
From your code, the image will be null when you use UIImage img = UIImage.FromFile(PhotoDataClass._file);, so this null exception is thrown.
You should use PHAsset to retrieve the image from the system photo library instead of using the URL directly. You can store the LocalIdentifier in the PhotoDataClass class, then retrieve the PHAsset through this identifier when you want to use it later.
Modify the ImageData method:
void ImageData(PHAsset asset)
{
    if (asset == null)
        throw new Exception("PHAsset is null");
    PhotoDataClass.AssetIdentifier = asset.LocalIdentifier;
}
Then the BtnMessageDone_TouchUpInside method can be:
MFMailComposeViewController mailController;
if (MFMailComposeViewController.CanSendMail)
{
    var results = PHAsset.FetchAssetsUsingLocalIdentifiers(new string[] { PhotoDataClass.AssetIdentifier }, null);
    foreach (PHAsset asset in results)
    {
        if (asset.LocalIdentifier == PhotoDataClass.AssetIdentifier)
        {
            PHImageManager.DefaultManager.RequestImageData(asset, null, (data, dataUti, orientation, info) =>
            {
                StringBuilder htmlBodyMail = FormatEmailBody();
                mailController = new MFMailComposeViewController();
                // do mail operations here
                ...
                UIImage img = UIImage.LoadFromData(data);
                mailController.AddAttachmentData(img.AsJPEG(), "image/JPG", "Image.JPG");
                ...
            });
        }
    }
}
Moreover, regarding "That LastOrDefault() is not what I think it is":
This is because of your NSSortDescriptor. Use SortDescriptors = new[] { new NSSortDescriptor("creationDate", false) } or phAsset = PHAsset.FetchAssets(options).FirstOrDefault() as PHAsset; to get the photo you just captured with your camera.
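A minimal sketch of that corrected fetch, reusing the field names from the code above:
// Sketch: fetch the most recently created asset (newest first), per the suggestion above.
var options = new PHFetchOptions
{
    FetchLimit = 1,
    // false = sort by creationDate descending, so the newest photo comes first
    SortDescriptors = new[] { new NSSortDescriptor("creationDate", false) }
};
var newestAsset = PHAsset.FetchAssets(options).FirstOrDefault() as PHAsset;
if (newestAsset != null)
    PhotoDataClass.AssetIdentifier = newestAsset.LocalIdentifier;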

Android Attach Image between text with SpannableStringBuilder

Sorry for my English. I want to attach an image from the gallery and show it in an EditText with SpannableStringBuilder. I can successfully get the image path from the gallery. After a picture is selected, it is shown. But the second attached picture does not show, and the first attached image turns into text. Can anyone give me a solution? Big thanks.
This is my code:
private void IntentPict() {
    Intent intent = new Intent();
    intent.setType("image/*");
    intent.setAction(Intent.ACTION_GET_CONTENT);
    startActivityForResult(Intent.createChooser(intent, "Select File"), MY_INTENT_CLICK);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (resultCode == RESULT_OK) {
        if (requestCode == MY_INTENT_CLICK) {
            if (null == data) return;
            String selectedImagePath;
            Uri selectedImageUri = data.getData();
            // there is no problem with getting the path
            selectedImagePath = ImageFilePath.getPath(getApplicationContext(), selectedImageUri);
            // pathImg is an ArrayList<String>
            pathImg.add(selectedImagePath);
            addImageBetweentext(pathImg);
        }
    }
}

private void addImageBetweentext(ArrayList<String> listPath) {
    SpannableStringBuilder ssb = new SpannableStringBuilder();
    for (int i = 0; i < listPath.size(); i++) {
        String path = listPath.get(i);
        Drawable drawable = Drawable.createFromPath(path);
        drawable.setBounds(0, 0, 400, 400);
        ssb.append(mBodyText + "\n");
        ssb.setSpan(new ImageSpan(drawable), ssb.length() - (path.length() + 1),
                ssb.length() - 1, Spannable.SPAN_EXCLUSIVE_EXCLUSIVE);
    }
    mBodyText.setText(ssb);
}
Here's something that should work. I am loading resource drawables instead, just because it was quicker for me to validate, but I tried to keep as much as possible of your original flow and variable names:
Drawable[] drawables = new Drawable[2];
drawables[0] = getResources().getDrawable(R.drawable.img1);
drawables[1] = getResources().getDrawable(R.drawable.img2);
SpannableStringBuilder ssb = new SpannableStringBuilder();
for (int i = 0; i < drawables.length; i++) {
    Drawable drawable = drawables[i];
    drawable.setBounds(0, 0, 400, 400);
    String newStr = drawable.toString() + "\n";
    ssb.append(newStr);
    ssb.setSpan(
            new ImageSpan(drawable),
            ssb.length() - newStr.length(),
            ssb.length() - "\n".length(),
            Spannable.SPAN_INCLUSIVE_EXCLUSIVE);
}
mBodyText.setText(ssb);
Long story short: the start position for the incremental builder needs to be set to the current length of it, minus what you just added.

How to perform affine transform on stream?

I want to rotate an image that I get from a MemoryStream.
I've tried using a UIImageView:
private Stream TransformImageFromStream(MemoryStream stream, CGAffineTransform transform)
{
    var bytes = stream.ToBytes ();
    var data = NSData.FromArray (bytes);
    var image = UIImage.LoadFromData (data);
    var uiImage = new UIImageView (image);
    uiImage.Transform = transform;
    var result = uiImage.Image.AsPNG ().AsStream ();
    var testBytes = result.ToBytes ();
    if (testBytes [0] == bytes [0])
    {
        // throw new Exception ("test failed");
    }
    return result;
}
But the transform never gets applied to the output; as always with a UIView, it only affects how the view is drawn on screen.
I've found somewhere that it should work by applying the transform directly to a CIImage:
private Stream TransformImageFromStream(MemoryStream stream, CGAffineTransform transform)
{
    var bytes = stream.ToBytes ();
    using (var data = NSData.FromArray (bytes))
    using (var ciImage = CIImage.FromData (data))
    using (var transformedImage = ciImage.ImageByApplyingTransform (transform))
    using (var uiImage = UIImage.FromImage (ciImage))
    using (var uiImageView = new UIImageView (uiImage))
    {
        var result = uiImage.AsPNG ();
        return null;
    }
}
But the UIImage won't convert via AsPNG(), so I can't turn it into a stream. In this case I've noticed that its CGImage is empty, with most of its properties set to 0. Perhaps there is some way to convert the CIImage itself, without any wrapping?
I have no more clues as to what to do.
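One commonly suggested approach (not from this thread, so treat it as a hedged sketch) is to render the transformed CIImage through a CIContext into a CGImage; the resulting UIImage then has a real bitmap behind it and AsPNG() produces data:
// Sketch: render the transformed CIImage into a CGImage via CIContext so AsPNG() works.
private Stream TransformImageFromStream (MemoryStream stream, CGAffineTransform transform)
{
    using (var data = NSData.FromArray (stream.ToArray ()))
    using (var ciImage = CIImage.FromData (data))
    using (var transformed = ciImage.ImageByApplyingTransform (transform))
    using (var context = CIContext.FromOptions (null))
    using (var cgImage = context.CreateCGImage (transformed, transformed.Extent))
    using (var uiImage = UIImage.FromImage (cgImage))
    {
        return uiImage.AsPNG ().AsStream ();
    }
}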

Save UIImage to personal folder and then load it via UIImage.FromFile

I've built a picture selector via UIImagePickerController. Because of the memory issues this one has, I want to save the selected image to disk and, if needed, load it from the file path. But I can't manage to get it working.
If I bind the original image directly, it is displayed with no problems.
File.Exists in the code returns true, but image in the last line is null when inspected in the debugger. Thank you very much for your help!
NSData data = originalImage.AsPNG();
string path = Environment.GetFolderPath (Environment.SpecialFolder.Personal);
string pathTempImage = Path.Combine(path, "tempImage.png");
byte[] tempImage = new byte[data.Length];
File.WriteAllBytes(pathTempImage, tempImage);
if (File.Exists(pathTempImage))
{
    int i = 0;
}
UIImage image = UIImage.FromFile(pathTempImage);
Update
This is the code that works for me:
void HandleFinishedPickingMedia (object sender, UIImagePickerMediaPickedEventArgs e)
{
    _view.DismissModalViewControllerAnimated (true);
    BackgroundWorker bw = new BackgroundWorker();
    bw.DoWork += delegate(object bwsender, DoWorkEventArgs e2) {
        // determine what was selected, video or image
        bool isImage = false;
        switch (e.Info[UIImagePickerController.MediaType].ToString()) {
        case "public.image":
            Console.WriteLine("Image selected");
            isImage = true;
            break;
        case "public.video":
            Console.WriteLine("Video selected");
            break;
        }
        // get common info (shared between images and video)
        NSUrl referenceURL = e.Info[new NSString("UIImagePickerControllerReferenceUrl")] as NSUrl;
        if (referenceURL != null)
            Console.WriteLine("Url:" + referenceURL.ToString ());
        // if it was an image, get the other image info
        if (isImage) {
            // get the original image
            originalImage = e.Info[UIImagePickerController.OriginalImage] as UIImage;
            if (originalImage != null) {
                NSData data = originalImage.AsPNG();
                _picture = new byte[data.Length];
                ImageResizer resizer = new ImageResizer(originalImage);
                resizer.RatioResize(200, 200);
                string path = Environment.GetFolderPath (Environment.SpecialFolder.Personal);
                string pathTempImage = Path.Combine(path, "tempImage.png");
                string filePath = Path.Combine(path, "OriginalImage.png");
                NSData dataTempImage = resizer.ModifiedImage.AsPNG();
                byte[] tempImage = new byte[dataTempImage.Length];
                System.Runtime.InteropServices.Marshal.Copy(dataTempImage.Bytes, tempImage, 0, Convert.ToInt32(tempImage.Length));
                // OriginalImage
                File.WriteAllBytes(filePath, _picture);
                // TempImage
                File.WriteAllBytes(pathTempImage, tempImage);
                UIImage image = UIImage.FromFile(pathTempImage);
                _view.InvokeOnMainThread (delegate {
                    templateCell.BindDataToCell(appSelectPicture.Label, image);
                });
                _picture = null;
            }
        } else { // if it's a video
            // get video url
            NSUrl mediaURL = e.Info[UIImagePickerController.MediaURL] as NSUrl;
            if (mediaURL != null) {
                Console.WriteLine(mediaURL.ToString());
            }
        }
        // dismiss the picker
    };
    bw.RunWorkerAsync();
    bw.RunWorkerCompleted += HandleRunWorkerCompleted;
}
byte[] tempImage = new byte[data.Length];
File.WriteAllBytes(pathTempImage, tempImage);
You're not copying the image data into your allocated array before saving it. That results in a large empty file that is not a valid image.
Try using one of the NSData.Save overloads, like:
NSError error;
data.Save (pathTempImage, NSDataWritingOptions.FileProtectionNone, out error);
That will allow you to avoid allocating the byte[] array.
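Putting that suggestion together with the original snippet, a minimal sketch (reusing the question's variable names) could look like this:
// Sketch: write the PNG data straight to disk with NSData.Save, then reload it.
NSData data = originalImage.AsPNG();
string path = Environment.GetFolderPath(Environment.SpecialFolder.Personal);
string pathTempImage = Path.Combine(path, "tempImage.png");
NSError error;
if (!data.Save(pathTempImage, NSDataWritingOptions.FileProtectionNone, out error))
    Console.WriteLine("Could not save image: {0}", error);
UIImage image = UIImage.FromFile(pathTempImage); // no longer null once the file contains real PNG data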
