UIImage setImage is very, very slow - ios

I'm trying to load an image from a local URL using the following:
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:fileURL]];
[self.imageView setImage:image];
NSLog(#"imageView set");
So I see in the console "imageView set" almost immediately, but it takes a very long time for it to be reflected in the UI (sometimes a few minutes!).
Any idea why this is happening?

This happened to me when I was setting the image from a background thread (the one where I was downloading the image file). Just make sure the setImage: call runs on the main thread; the image will then change immediately.
// In a background thread...
UIImage *image = ... // Build or download the image
dispatch_async(dispatch_get_main_queue(), ^{
[self.imageView setImage:image]; // Run in main thread (UI thread)
});

You should load up Instruments and see exactly what it is doing.
For starters, you should do what you can to avoid I/O on the drawing thread. Also, are there other I/O requests at the same time?
A UIImage does not necessarily need to be a singular bitmap representation -- it may be backed by a cache and/or loaded lazily. So just because an 'image' is 'set', does not mean that the optimal bitmap has been loaded into memory and prepared for rendering -- it may be deferred until render (draw) is requested.
Profiling, on the other hand, will tell you (at least generally) why it is taking longer than expected.

Loading an image from NSData takes longer.
Instead you can simply load it from the file directly (note that +imageWithContentsOfFile: takes a path string, not an NSURL):
UIImage *image = [UIImage imageWithContentsOfFile:fileURL.path];
Try it.

So I tried the following:
dispatch_async(dispatch_get_global_queue(0, 0), ^{
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:fileURL]];
[self.imageView setImage:image];
NSLog(#"imageView set");
});
And it is quite a bit faster. Not sure why though.
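Note that this still calls setImage: from the background queue, which UIKit does not support. A minimal sketch of the thread-safe variant (same fileURL and imageView as above), keeping the load off the main thread but the UIKit call on it:
dispatch_async(dispatch_get_global_queue(0, 0), ^{
    // File I/O and image creation stay off the main thread.
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:fileURL]];
    dispatch_async(dispatch_get_main_queue(), ^{
        // UIKit work goes back to the main thread.
        [self.imageView setImage:image];
        NSLog(@"imageView set");
    });
});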

Related

iOS warning on background thread

I am downloading an image on a background thread and setting it on a UIImageView on the main thread, but I am getting a warning like:
Using low GPU priority for background rendering
Please check the code below for reference:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^(void) {
int randomImgNum = [[contentDict valueForKey:@"imageIndex"] intValue];
UIImage *image = [UIImage imageNamed:[Utils getImageFromIndex:randomImgNum]];
UIImage *newBlurredImage = [image blurredImageWithRadius:5.0];
dispatch_sync(dispatch_get_main_queue(), ^(void) {
customVieww.imgCardView.image = image;
[self setBlurrImage:newBlurredImage];
});
});
Please let me know what the issue could be and how to resolve it, and point out anything I may have missed.
Thanks in advance.
If you want to update the view you have to use the main queue, because changing a UIView from a background queue can cause problems.
Updated code:
dispatch_async(dispatch_get_main_queue(), ^{
int randomImgNum = [[contentDict valueForKey:@"imageIndex"] intValue];
UIImage *image = [UIImage imageNamed:[Utils getImageFromIndex:randomImgNum]];
UIImage *newBlurredImage = [image blurredImageWithRadius:5.0];
// Already on the main queue, so set the UI directly;
// a nested dispatch_sync onto the main queue here would deadlock.
customVieww.imgCardView.image = image;
[self setBlurrImage:newBlurredImage];
});
Check this:
According to Apple's documentation, iOS does not allow GPU access to any background application, for the obvious reason that it is not in the foreground.
https://developer.apple.com/library/content/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/ImplementingaMultitasking-awareOpenGLESApplication/ImplementingaMultitasking-awareOpenGLESApplication.html

Size limit in UIImageView?

I'm trying to display a rather big image in a UIImageView. The image downloads correctly (using an NSURLSessionDownloadTask) and I can create a UIImage from it; however, nothing is displayed in the UIImageView. I don't get any memory warnings or errors. Does anybody know what's going on?
-(void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask didFinishDownloadingToURL:(NSURL *)location
{
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:location]];
self.imageView.image = image;
self.progressView.progress = 0.0f;
[self.activityView stopAnimating];
[self.activityView setHidden:YES];
NSLog(#"Received full Image!"];
}
Your image just crashed the Chrome tab I tried to display it in on a 4GB laptop. It also caused my laptop to sputter quite a bit before it was saved.
It is a 36 MB JPEG-compressed bitmap, 18K x 18K pixels. Let's do some math.
At that resolution you have 324,000,000 pixels in all, each represented by at least 3 bytes of color information once decoded. That is roughly 1 GB of raw pixel data, and I am not sure that accounts for how a UIImage instance stores it internally.
Given the above, and the fact that the most recent (and most powerful) iOS devices have 1 GB of RAM in total, I think it is safe to say the image is simply too big, even if there is no inherent size limit involved.
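If you really do need to display something from a file that large, one option (a sketch not from the original answers, using ImageIO; the 2048-pixel cap is arbitrary) is to let CGImageSource decode a downscaled version instead of the full 18K x 18K bitmap:
#import <ImageIO/ImageIO.h>

// Decode a downscaled copy of the downloaded file instead of the full-size bitmap.
CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)location, NULL);
if (source) {
    NSDictionary *options = @{
        (id)kCGImageSourceCreateThumbnailFromImageAlways : @YES,
        (id)kCGImageSourceCreateThumbnailWithTransform   : @YES,  // respect EXIF orientation
        (id)kCGImageSourceThumbnailMaxPixelSize          : @2048  // arbitrary cap
    };
    CGImageRef scaled = CGImageSourceCreateThumbnailAtIndex(source, 0, (__bridge CFDictionaryRef)options);
    if (scaled) {
        UIImage *image = [UIImage imageWithCGImage:scaled];
        CGImageRelease(scaled);
        // ...then hand `image` to the image view on the main queue, as in the answer below.
    }
    CFRelease(source);
}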
I think you are trying to set your imageView content from a background thread. Any UI update must come from your main thread (main queue) in order to be safe. Try setting the UIImageView's image property on your main queue, by doing the following:
-(void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask didFinishDownloadingToURL:(NSURL *)location
{
UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:location]];
dispatch_async(dispatch_get_main_queue(), ^{
self.imageView.image = image;
self.progressView.progress = 0.0f;
[self.activityView stopAnimating];
[self.activityView setHidden:YES];
});
NSLog(#"Received full Image!"];
}
EDIT: What I wrote above remains true, but as dandan78 pointed out your image is simply too big, so the solution above is unlikely to fix your problem since you can't overcome the de facto memory limitations.
Make sure you are on main thread. UI cannot be changed from non-main thread. You can use
[self.imageView performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:NO];
to ensure that.

How to load UIImageView objects in the background?

Is it safe to load UIImageView objects in the background and, when done, insert them into the view hierarchy on the main thread?
For example, you create a GCD block that loads 10 image views in the background, and at the end you have an async block that adds all the UIImageViews to the view hierarchy.
I've heard that if you create a UIImage and assign it to a UIImageView, the image data gets loaded on demand when the UIImageView needs it. How can I force the UIImage data to be pulled in in the background so it doesn't block the main thread with a long load?
If you are downloading images from a server, it is always better to download them on a separate thread so they don't block your UI. Once the image download completes, you can set it on the specific image view from the main thread.
__weak typeof(self) weakSelf = self;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^(void) {
NSData *imageData= [NSData dataWithContentsOfURL:Image_URL];
UIImage *image = [UIImage imageWithData:imageData];
dispatch_sync(dispatch_get_main_queue(), ^(void) {
__strong __typeof__(weakSelf) strongSelf = weakSelf;
strongSelf.someImageView.image = image;
});
});
NOTE:
If you are using AFNetworking, you can use its UIImageView category, which handles loading the image in the background and can also cache it, so if you request the same image again it will come from the cache.
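A minimal sketch of that category in use (the exact selector can vary between AFNetworking versions, and the URL and placeholder name here are just examples):
#import "UIImageView+AFNetworking.h"

// Downloads (or pulls from cache) off the main thread and sets the image when ready.
[self.someImageView setImageWithURL:[NSURL URLWithString:@"http://example.com/photo.jpg"]
                   placeholderImage:[UIImage imageNamed:@"placeholder.png"]];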

imageWithCGImage: GCD memory issue

When I perform the following only on the main thread iref gets autoreleased immediately:
-(void)loadImage:(ALAsset*)asset{
@autoreleasepool {
ALAssetRepresentation* rep = [asset defaultRepresentation];
CGImageRef iref = [rep fullScreenImage];
UIImage* image = [UIImage imageWithCGImage:iref
scale:[rep scale]
orientation:UIImageOrientationUp];
[self.imageView setImage:image];
}
}
But when I perform imageWithCGImage: with GCD on a background thread, iref does not get released instantly like in the first example, only after about a minute:
-(void)loadImage:(ALAsset*)asset{
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
@autoreleasepool {
ALAssetRepresentation* rep = [asset defaultRepresentation];
CGImageRef iref = [rep fullScreenImage];
UIImage* image = [UIImage imageWithCGImage:iref
scale:[rep scale]
orientation:UIImageOrientationUp];
dispatch_async(dispatch_get_main_queue(), ^(void) {
[self.imageView setImage:image];
});
}
});
}
How can I make the CGImageRef object to be released immediately?
Prior Research:
The Leaks instrument doesn't show any leaks when I record with it.
The Allocations instrument shows that a CGImageRef object was allocated and is still living for about a minute after it should have been released.
If I try to manually release the CGImageRef object using CGImageRelease, I get an EXC_BAD_ACCESS after a minute, when the image tries to get autoreleased.
Retaining iref with CGImageRetain and then releasing it with CGImageRelease doesn't work either.
Similar questions on Stack Overflow that didn't help: image loading with GCD, received memory warning when creating image, memory leak when getting fullScreenImage from ALAsset result.
First off, there's not a notion of an "autoreleased" CF object. You can get into situations where such a thing exists when dealing with toll-free bridged classes, but as you can see, there's a CFRetain and a CFRelease but no CFAutorelease. So I think you're misconstruing the ownership of iref. Let's trace ownership throughout this code:
-(void)loadImage:(ALAsset*)asset{
asset is passed into this method. Its retain count is presumed to be at least 1.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
The block closure takes a retain on asset.
@autoreleasepool {
ALAssetRepresentation* rep = [asset defaultRepresentation];
This returns you, by naming convention, an object that you don't own. It might be autoreleased, it might be a singleton/global, etc. But you don't own it and shouldn't take ownership of it by retaining it.
CGImageRef iref = [rep fullScreenImage];
Since there's not a notion of an "autoreleased" CF object, we'll presume that rep is returning you an inner pointer to a CGImageRef owned by rep. You also don't own this, and shouldn't retain it. Relatedly you don't control when it will go away. A reasonable guess would be that it will live as long as rep and a reasonable guess is that rep will live as long as asset so you should probably assume that iref will live at least as long as asset.
UIImage* image = [UIImage imageWithCGImage:iref
scale:[rep scale]
orientation:UIImageOrientationUp];
If the UIImage needs the CGImageRef to stick around, it will take a retain or make a copy to ensure that it stays alive. (Probably the latter.) The UIImage itself is autoreleased, by naming convention.
dispatch_async(dispatch_get_main_queue(), ^(void) {
This inner block closure is going to take a retain on image (and self). The block will be copied by libdispatch extending the life of those retains until the block is executed and itself released.
[self.imageView setImage:image];
The image view is going to take a retain (or copy) on image if it needs to, in order to do its job.
});
The inner block is done executing. At some point in the future, libdispatch will release it, which will transitively release the retains taken by the block closure on self and image.
}
Your autorelease pool pops here. Anything that was implicitly retain/autoreleased should be released now.
});
The outer block is done executing. At some point in the future, libdispatch will release it, which will transitively release the retain taken by the block closure on asset.
}
Ultimately, this method cannot control the lifetime of the CGImageRef in iref because it never has ownership of it. The implication here is that the CGImageRef is transitively owned by asset, so it will live at least as long as asset. Since asset is retained by virtue of being used in the outer block (i.e. retained by the outer block's closure), and since libdispatch makes no promises about when finished blocks will be released, you effectively can't guarantee that iref will go away any sooner than libdispatch gets around to it.
If you wanted to go to manual retain/release, and be as explicit about it as possible, you could do this:
-(void)loadImage:(ALAsset*)asset{
__block ALAsset* weakAsset = [asset retain]; // asset +1
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void) {
@autoreleasepool {
ALAssetRepresentation* rep = [weakAsset defaultRepresentation];
CGImageRef iref = [rep fullScreenImage];
UIImage* image = [UIImage imageWithCGImage:iref
scale:[rep scale]
orientation:UIImageOrientationUp];
__block UIImage* weakImage = [image retain]; // image +1
[weakAsset release]; // asset -1
dispatch_async(dispatch_get_main_queue(), ^(void) {
[self.imageView setImage: weakImage];
[weakImage release]; // image -1
});
}
});
}
__block prevents the block closures from retaining asset and image allowing you to explicitly retain/release them yourself. This will mean that of all the things you create, all will be explicitly disposed. (rep and image are presumably retain/autoreleased, but your pool should take care of those.) I believe this is the most explicit you can be about this, given that asset is passed in to you, and therefore you don't control how long it lives, and it is ultimately the "owner" (as far as this scope is concerned) of the CGImageRef stored in iref.
Hope that clarifies things a bit.
You should retain the CGImageRef:
CGImageRef iref = CGImageRetain([rep fullScreenImage]);
// ...and before the [self.imageView setImage:image] call, release it:
CGImageRelease(iref);
The rest is just a matter of the run loop: in the version without GCD the image is released instantaneously, while in the other it is managed by GCD. Either way, the code is wrong as it stands; someone has to take ownership of iref.

How to asynchronously load an image in an UIImageView?

I have a UIView with a UIImageView subview. I need to load an image into the UIImageView without blocking the UI. The blocking call seems to be +[UIImage imageNamed:]. Here is what I thought would solve this problem:
-(void)updateImageViewContent {
dispatch_async(
dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
UIImage * img = [UIImage imageNamed:@"background.jpg"];
dispatch_sync(dispatch_get_main_queue(), ^{
[[self imageView] setImage:img];
});
});
}
The image is small (150x100).
However the UI is still blocked when loading the image. What am I missing?
Here is a small code sample that exhibits this behaviour:
Create a new class based on UIImageView, set its userInteractionEnabled to YES, add two instances to a UIView, and implement its touchesBegan method like this:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
if (self.tag == 1) {
self.backgroundColor= [UIColor redColor];
}
else {
dispatch_async(dispatch_get_main_queue(), ^{
[self setImage:[UIImage imageNamed:@"woodenTile.jpg"]];
});
[UIView animateWithDuration:0.25 animations:
^(){[self setFrame:CGRectInset(self.frame, 50, 50)];}];
}
}
Assign the tag 1 to one of these imageViews.
What happens exactly when you tap the two views almost simultaneously, starting with the view that loads an image? Does the UI get blocked because it's waiting for [self setImage:[UIImage imageNamed:@"woodenTile.jpg"]]; to return? If so, how may I do this asynchronously?
Here is a project on GitHub with ipmcc's code.
Use a long press then drag to draw a rectangle around the black squares. As I understand his answer, in theory the white selection rectangle should not be blocked the first time the image is loaded, but it actually is.
Two images are included in the project (one small: woodenTile.jpg and one larger: bois.jpg). The result is the same with both.
Image format
I don't really understand how this is related to the problem I still have with the UI being blocked while the image is loaded for the first time, but PNG images decode without blocking the UI, while JPG images do block the UI.
Chronology of the events
(Two screenshots in the original post mark where the blocking of the UI begins and ends.)
AFNetworking solution
NSURL *url = [[NSBundle mainBundle] URLForResource:@"bois" withExtension:@"jpg"];
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[self.imageView setImageWithURLRequest:request
placeholderImage:[UIImage imageNamed:@"placeholder.png"]
success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
NSLog(@"success: %@", NSStringFromCGSize([image size]));
} failure:^(NSURLRequest *request, NSHTTPURLResponse *response, NSError *error) {
NSLog(@"failure: %@", response);
}];
// this code works. Used to test that url is valid. But it's blocking the UI as expected.
if (false)
if (url) {
[self.imageView setImage: [UIImage imageWithData:[NSData dataWithContentsOfURL:url]]]; }
Most of the time, it logs: success: {512, 512}
It also occasionally logs: success: {0, 0}
And sometimes: failure: <NSURLResponse: 0x146c0000> { URL: file:///var/mobile/Appl...
But the image is never changed.
The problem is that UIImage doesn't actually read and decode the image until the first time it's actually used/drawn. To force this work to happen on a background thread, you have to use/draw the image on the background thread before doing the main thread -setImage:. This worked for me:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0), ^{
UIImage * img = [UIImage imageNamed:@"background.jpg"];
// Make a trivial (1x1) graphics context, and draw the image into it
UIGraphicsBeginImageContext(CGSizeMake(1,1));
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextDrawImage(context, CGRectMake(0, 0, 1, 1), [img CGImage]);
UIGraphicsEndImageContext();
// Now the image will have been loaded and decoded and is ready to rock for the main thread
dispatch_sync(dispatch_get_main_queue(), ^{
[[self imageView] setImage: img];
});
});
EDIT: The UI isn't blocking. You've specifically set it up to use a UILongPressGestureRecognizer, which waits, by default, half a second before doing anything. The main thread is still processing events, but nothing is going to happen until that GR times out. If you do this:
longpress.minimumPressDuration = 0.01;
...you'll notice that it gets a lot snappier. The image loading is not the problem here.
EDIT 2: I've looked at the code, as posted to GitHub, running on an iPad 2, and I simply do not get the hiccup you're describing. In fact, it's quite smooth. Here's a screenshot from running the code in the Core Animation instrument:
As you can see on the top graph, the FPS goes right up to ~60FPS and stays there throughout the gesture. On the bottom graph, you can see the blip at about 16s which is where the image is first loaded, but you can see that there's not a drop in the frame rate. Just from visual inspection, I see the selection layer intersect, and there's a small, but observable delay between the first intersection and the appearance of the image. As far as I can tell, the background loading code is doing its job as expected.
I wish I could help you more, but I'm just not seeing the problem.
You can use the AFNetworking library: import its UIImageView+AFNetworking category and use the method as follows:
[YourImageView setImageWithURL:[NSURL URLWithString:@"http://image_to_download_from_server.jpg"]
placeholderImage:[UIImage imageNamed:@"static_local_image.png"]
success:^(NSURLRequest *request, NSHTTPURLResponse *response, UIImage *image) {
//ON success perform
}
failure:NULL];
Hope this helps.
I had a very similar issue in my application, where I had to download a lot of images while my UI was continuously updating. Here is a simple tutorial that resolved my issue:
NSOperations & NSOperationQueues Tutorial
This is a good way:
-(void)updateImageViewContent {
dispatch_async(dispatch_get_main_queue(), ^{
UIImage * img = [UIImage imageNamed:@"background.jpg"];
[[self imageView] setImage:img];
});
}
Why don't you use a third-party library like AsyncImageView? With it, all you have to do is declare your AsyncImageView object and pass the URL or image you want to load. An activity indicator is displayed while the image loads, and nothing blocks the UI.
-touchesBegan: is called on the main thread. By calling dispatch_async(dispatch_get_main_queue(), ...) you just put the block on the queue. GCD will process this block when the main queue is ready, i.e. when the system has finished processing your touches. That's why you don't see woodenTile loaded and assigned to self.image until you release your finger and let GCD process all the blocks that have been queued on the main queue.
Replacing :
dispatch_async(dispatch_get_main_queue(), ^{
[self setImage:[UIImage imageNamed:@"woodenTile.jpg"]];
});
by :
[self setImage:[UIImage imageNamed:@"woodenTile.jpg"]];
should solve your issue… at least for the code that exhibits it.
Consider using SDWebImage: it not only downloads and caches the image in the background, but also loads and renders it.
I've used it with good results in a table view that had large images that were slow to load even after they had been downloaded.
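A minimal sketch of the SDWebImage category (recent releases prefix the methods with sd_; older ones used plain setImageWithURL:, and the URL and placeholder here are just examples):
#import <SDWebImage/UIImageView+WebCache.h>

// Loads asynchronously, caches in memory/on disk, and sets the image on the main thread.
[self.imageView sd_setImageWithURL:[NSURL URLWithString:@"http://example.com/large-photo.jpg"]
                  placeholderImage:[UIImage imageNamed:@"placeholder.png"]];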
https://github.com/nicklockwood/FXImageView
This is an image view which can handle background loading.
Usage
FXImageView *imageView = [[FXImageView alloc] initWithFrame:CGRectMake(0, 0, 100.0f, 150.0f)];
imageView.contentMode = UIViewContentModeScaleAspectFit;
imageView.asynchronous = YES;
//show placeholder
imageView.processedImage = [UIImage imageNamed:@"placeholder.png"];
//set image with URL. FXImageView will then download and process the image
[imageView setImageWithContentsOfURL:url];
To get a URL for your file, you might find the following interesting:
Getting bundle file references / paths at app launch
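For reference, a small sketch of the two common ways to point at a bundled file such as the bois.jpg used above:
// File URL, for APIs that take an NSURL:
NSURL *url = [[NSBundle mainBundle] URLForResource:@"bois" withExtension:@"jpg"];

// Filesystem path, for APIs such as +imageWithContentsOfFile: that take a string path:
NSString *path = [[NSBundle mainBundle] pathForResource:@"bois" ofType:@"jpg"];
UIImage *image = (path != nil) ? [UIImage imageWithContentsOfFile:path] : nil;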
When you are using AFNetworking in an application, you do not need to write your own blocks to load images, because AFNetworking provides a solution for it. As below:
#import "UIImageView+AFNetworking.h"
and then use the setImageWithURL: method of AFNetworking.
Thanks
One way I've implemented it is the following (although I do not know if it's the best one).
First I create a queue, either a serial queue of my own or one of the global concurrent queues:
queue = dispatch_queue_create("com.myapp.imageProcessingQueue", DISPATCH_QUEUE_SERIAL);
or
queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
Whichever you may find better for your needs.
Afterwards:
dispatch_async( queue, ^{
// Load a UIImage from the URL, e.g. with imageWithData: as below
UIImage *imagename = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
// Then to set the image it must be done on the main thread
dispatch_sync( dispatch_get_main_queue(), ^{
[page_cover setImage: imagename];
});
});
There is a set of methods introduced on UIImage in iOS 15 to decode images and create thumbnails asynchronously on a background thread:
func prepareForDisplay(completionHandler: (UIImage?) -> Void)
Decodes an image asynchronously and provides a new one for display in views and animations.
func prepareThumbnail(of: CGSize, completionHandler: (UIImage?) -> Void)
Creates a thumbnail image at the specified size asynchronously on a background thread.
You can also use a set of similar synchronous APIs if you need more control over where the decoding happens, e.g. a specific queue:
func preparingForDisplay() -> UIImage?
func preparingThumbnail(of: CGSize) -> UIImage?
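From Objective-C the asynchronous pair is exposed (as far as I can tell) as prepareForDisplayWithCompletionHandler: and prepareThumbnailOfSize:completionHandler:. A minimal sketch that hops back to the main queue before touching UIKit, since the completion handler is not guaranteed to run there:
UIImage *image = [UIImage imageNamed:@"background.jpg"];

// Decode up front on a background queue instead of lazily at draw time (iOS 15+).
[image prepareForDisplayWithCompletionHandler:^(UIImage *decoded) {
    dispatch_async(dispatch_get_main_queue(), ^{
        // Fall back to the original image if decoding failed.
        self.imageView.image = decoded ?: image;
    });
}];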
