I am using "SVProgressHUD" for loader. When I want to load an image from url I am using async process to load the image in background. After completing the download I am updating the UI. And for that time span I am using a placeholder on the imageview and a loader with "Please wait..". But I want to use a loader like "Instagram". That means, when an image is loading from online it loads a blurry image of the original image and upon downloading the image , the blurry image show the original image.
Can any one please suggest me how can I do this?
The stuff you are talking about is Progressive JPEG.
Progressive JPEG (PJPEG) is an image format that stores multiple,
individual “scans” of a photo, each with an increasing level of
detail. When put together, the scans create a full-quality image. The
first scan gives a very low-quality representation of the image, and
each following scan further increases the level of detail and quality.
When images are downloaded using PJPEG, we can render the image as
soon as we have the first scan. As later scans come through, we update
the image and re-render it at higher and higher quality.
So, you have to make your images available in this progressive format. You can follow an approach where every image is converted into the appropriate progressive format before being saved on the server. There are several tools available for different types of servers, e.g. jpegtran (a typical invocation is jpegtran -progressive in.jpg > out.jpg).
To check whether an image is progressive or not, you can use one of the many online tools available.
Now, for iOS, you can follow this tutorial, or use one of these libraries:
Concorde
DFImageManager
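As a minimal sketch of one way to render each scan as it arrives, here is incremental decoding with Apple's ImageIO (the libraries above offer more complete implementations). This assumes you feed it chunks as they download, for example from URLSession's didReceive delegate callback:
import UIKit
import ImageIO

final class ProgressiveDecoder {
    private let source = CGImageSourceCreateIncremental(nil)
    private var data = Data()

    // Call with each newly received chunk; every completed scan of a
    // progressive JPEG yields a sharper image than the one before.
    func append(_ chunk: Data, isFinal: Bool) -> UIImage? {
        data.append(chunk)
        CGImageSourceUpdateData(source, data as CFData, isFinal)
        guard let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}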
Assuming you are in control of the hosting of the images, you can host lower-resolution versions and make two fetches: one for the low-resolution image and one for the high-resolution one. If the low-resolution file is significantly smaller, its fetch will finish before the fetch for the full image, and you can populate the UIImageView with it until you can replace it.
Purely example code would look like this:
let imageView = UIImageView()

let lowResOperation = NSBlockOperation {
    guard let imageData = NSData(contentsOfURL: NSURL(string: "https://myhost.org/low-res-image")!),
        image = UIImage(data: imageData) else { return }
    // Update the UI on the main queue; only show the low-res image
    // if the high-res one hasn't already arrived
    dispatch_async(dispatch_get_main_queue()) {
        if imageView.image == nil {
            imageView.image = image
        }
    }
}

let highResOperation = NSBlockOperation {
    guard let imageData = NSData(contentsOfURL: NSURL(string: "https://myhost.org/high-res-image")!),
        image = UIImage(data: imageData) else { return }
    // The full-quality image always replaces whatever is showing
    dispatch_async(dispatch_get_main_queue()) {
        imageView.image = image
    }
}

// Run both downloads concurrently off the main thread;
// don't block the caller waiting for them to finish
let backgroundQueue = NSOperationQueue()
backgroundQueue.addOperations([lowResOperation, highResOperation], waitUntilFinished: false)
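Note the imageView.image == nil check: the two downloads race, so if the high-resolution image happens to finish first, the late-arriving low-resolution result must not overwrite it.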
I'm creating an application in iOS where I load images from an API using a UITableView and UITableViewCell.
Since the UITableView reuses cells, old images were appearing when I scrolled fast. To prevent this, I set a default image using a system image (SF Symbols).
I also use a cache that stores the downloaded images, keyed by their URLs.
Everything works as it should, but now that I think of it, I'm sending a network request to retrieve that system image each time, which seems incredibly inefficient, since I was using a cache to reduce the total network calls in the first place.
Is there a way around this, or is this a tradeoff I must make?
Code is below.
//use default image from SF Symbols
let defaultIcon = UIImage(systemName: "photo")?.withTintColor(.gray, renderingMode: .alwaysOriginal)
DispatchQueue.main.async {
    cell.mealImage.image = defaultIcon
}
guard cell.meal?.strMealThumb != nil else {
    print("Category Image doesn't exist")
    return
}
//use cache (stores UIImage objects keyed by the URL string)
if let cachedImage = model.imagecache.object(forKey: cell.meal!.strMealThumb as NSString) {
    print("using cache")
    DispatchQueue.main.async {
        cell.mealImage.image = cachedImage
    }
}
else {
    let url = URL(string: cell.meal!.strMealThumb)
    let session = URLSession.shared.dataTask(with: url!) { data, response, error in
        if error == nil && data != nil {
            let image = UIImage(data: data!)
            self.model.imagecache.setObject(image!, forKey: cell.meal!.strMealThumb as NSString)
            DispatchQueue.main.async {
                cell.mealImage.image = image
            }
        }
    }
    session.resume()
}
Override the prepareForReuse method in your UITableViewCell subclass and clean up, in that method, any data that could persist from a previous use of the cell. In your example, assign the default image there to produce a better result.
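A minimal sketch, assuming a custom cell class with the mealImage outlet from the question:
override func prepareForReuse() {
    super.prepareForReuse()
    // Reset to the placeholder so a recycled cell never shows a stale image
    mealImage.image = UIImage(systemName: "photo")?
        .withTintColor(.gray, renderingMode: .alwaysOriginal)
}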
You asked:
I set a default image using a system image(SF symbols).
...
Everything works as it should but now I think of it I'm sending a network request to retrieve that systemImage each time which seems incredibly inefficient since I was using a cache in order to reduce the total network calls in the first place.
No, UIImage(systemName:) does not make a network request. And it caches the image itself, as the documentation says:
This method checks the system caches for an image with the specified name and returns the variant of that image that is best suited for the main screen. If a matching image object is not already in the cache, this method creates the image from the specified system symbol image. The system may purge cached image data at any time to free up memory. Purging occurs only for images that are in the cache but are not currently being used.
FWIW, you can empirically verify that this does not perform a network request by disconnecting from the network and trying to use it. You will see that it works fine, even when disconnected.
FWIW, there is a very small performance gain (less than a millisecond?) in keeping a reference to that tinted system image and reusing it, rather than fetching the cached system image and re-tinting it each time. But the performance improvement is negligible.
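For completeness, a sketch of that micro-optimization (the ImageAssets name is illustrative, not from your code):
enum ImageAssets {
    // Computed once on first access, then reused for every cell
    static let placeholder = UIImage(systemName: "photo")?
        .withTintColor(.gray, renderingMode: .alwaysOriginal)
}

// Usage in tableView(_:cellForRowAt:):
// cell.mealImage.image = ImageAssets.placeholder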
I am getting an image as Data from the server and using UIImage(data:) to show it. For one specific image the application crashes because of memory pressure. What could be the reason, and how do I fix it?
if let imgData = Utils.fetchDataFromDocumentDirectory(imageName: "test.jpg") {
    attachmentImgView.image = UIImage(data: imgData)
}
How big is the image? And are you downloading only one image or an array?
This might be caused by the size of the image you are trying to download.
My advice, if you are trying to show an image, is to load it given only its URL, like so:
// Note: Data(contentsOf:) blocks the calling thread, so avoid this on main
if let url = URL(string: "http://i.imgur.com/w5rkSIj.jpg"),
    let imageData = try? Data(contentsOf: url) {
    let image = UIImage(data: imageData)
}
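As a sketch, here is the same load done asynchronously with plain URLSession, so the calling thread is never blocked (imageView is a stand-in for wherever the result is displayed):
if let url = URL(string: "http://i.imgur.com/w5rkSIj.jpg") {
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard error == nil, let data = data, let image = UIImage(data: data) else { return }
        // Hop back to the main thread for any UI work
        DispatchQueue.main.async {
            imageView.image = image
        }
    }.resume()
}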
Even better, you can try the AsyncImageView third-party library, where you give it the URL of the image and it shows it asynchronously, without delays in the app.
Hope this helps!
I am downloading images from the server with the Alamofire framework, using Alamofire.download. Each image is around 1 MB, but memory increases a lot after each request: after downloading 4 images the memory used is above 171 MB, and each further image adds more than 35 MB.
Downloading code is:
Alamofire.download(mainReguest, to: self.destination)
    .downloadProgress { progress in
        self.progressView.progress = Float(progress.fractionCompleted)
    }
    .response { response in
        if response.error == nil, let imagePath = response.destinationURL?.path {
            let image = UIImage(contentsOfFile: imagePath)
            self.addNewImageToTheScrollView(img: image)
        }
    }
The code for the destination is:
let destination = DownloadRequest.suggestedDownloadDestination(for: .documentDirectory,
in: .userDomainMask,
with: [DownloadRequest.DownloadOptions.removePreviousFile])
The problem is with this...
UIImage(contentsOfFile: imagePath)
Specifically with the fact that you are (presumably) downloading a compressed format of your image.
However, UIImage is uncompressed in memory. So each pixel of your image will take 4 bytes of information (red, green, blue, alpha).
So if you have an image that is, for example, 5 megapixels, then 5,000,000 pixels * 4 bytes = 20 MB. Given the ~35 MB you see per image, I imagine your images are around 10 MP?
The best way around this is to optimise your image download. If you're displaying an image on an iPhone, there is no point downloading a 10 MP version of it; you might as well resize it to be much, much smaller.
Ideally, you should be resizing the images on the backend before the download happens.
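If resizing on the backend isn't an option, here is a hedged sketch of client-side downsampling with ImageIO, which decodes directly to a reduced-size bitmap instead of inflating the full image into memory (maxPixelSize is a hypothetical parameter; pick whatever your layout actually needs):
import UIKit
import ImageIO

func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    // Don't decode the full-size image when creating the source
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    // Decode directly at the reduced size
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}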
Your Alamofire code is OK. Most likely you have an issue somewhere in the UI part, for example because of the high image resolution. The first thing to do is to localise the issue: to check the networking code, comment out all UI-related code and run your app one more time.
Alamofire.download(mainReguest, to: self.destination)
    .downloadProgress { progress in
        // self.progressView.progress = Float(progress.fractionCompleted)
    }
    .response { response in
        if response.error == nil, let imagePath = response.destinationURL?.path {
            let image = UIImage(contentsOfFile: imagePath)
            print("Image is successfully downloaded!")
            // self.addNewImageToTheScrollView(img: image)
        }
    }
I'm trying to compress between 20 and 30 UIImages and get the NSData from each with a for-loop like this:
for theImage in selectedUIImages {
    let data = UIImageJPEGRepresentation(theImage, 0.5)
    // doing something with the data
}
I tried this on an iPhone 7 with no issues, besides my app using up to 700 MB of memory while going through the loop, but on an older iPhone I get the message:
*Message from debugger: Terminated due to memory issue.*
The main objective is to get the NSData from each UIImage so I can put the image in a directory for uploading. Let me explain: the Amazon S3 Transfer Utility wants a path/URL to the image, and therefore I need to make a path/URL for the UIImage. The only way I know to get one is:
data.write(to: URL(fileURLWithPath: localPath), options: .atomic)
Try using an autorelease pool inside the loop, so the autoreleased data from each iteration is freed before the next one starts instead of accumulating until the loop ends:
for theImage in selectedUIImages {
    autoreleasepool {
        let data = UIImageJPEGRepresentation(theImage, 0.5)
        // doing something with the data
    }
}
and move it to a background thread.
That happens because your app runs out of memory.
You can save each image to the Documents directory after compressing it, then upload them to the server one by one, so they do not all stay in memory at once.
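A sketch of that suggestion, assuming the selectedUIImages array from the question (the file names are illustrative):
var fileURLs: [URL] = []
let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
for (index, theImage) in selectedUIImages.enumerated() {
    autoreleasepool {
        // Compress, write straight to disk, and keep only the file URL in memory
        guard let data = UIImageJPEGRepresentation(theImage, 0.5) else { return }
        let fileURL = documents.appendingPathComponent("upload-\(index).jpg")
        try? data.write(to: fileURL, options: .atomic)
        fileURLs.append(fileURL)
    }
}
// fileURLs can now be handed to the S3 transfer utility one by one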
You can also decrease the data size by lowering the compression-quality parameter, using 0.3 instead of 0.5:
for theImage in selectedUIImages {
    let data = UIImageJPEGRepresentation(theImage, 0.3)
    // doing something with the data
}
I am trying to measure the time taken to load a large photo (JPEG) from file into a UIImageView on iOS 8.0.
My current code:
import UIKit
class ViewController: UIViewController {
    @IBOutlet weak var imageView: UIImageView!

    @IBAction func loadImage(sender: UIButton) {
        if let imageFile = NSBundle.mainBundle().pathForResource("large_photo", ofType: "jpg") {
            // start our timer
            let tick = Tick()
            // loads a very large image file into imageView
            // the test photo used is a 4608 × 3456 pixel JPEG
            // using contentsOfFile: to prevent caching while testing timer
            imageView.image = UIImage(contentsOfFile: imageFile)
            // stop our timer and print execution time
            tick.tock()
        }
    }
}

class Tick {
    let tickTime: NSDate

    init() {
        tickTime = NSDate()
    }

    func tock() {
        let tockTime = NSDate()
        let executionTime = tockTime.timeIntervalSinceDate(tickTime)
        println("[execution time]: \(executionTime)")
    }
}
When I load a very large image (4608 x 3456 JPEG) on my test device (5th gen iPod touch), I can see that the execution time is ~2-3 seconds and blocks the main thread. This is observable by the fact that the UIButton remains in a highlighted state for this period of time and no other UI elements allow interaction.
I would therefore expect my timing function to report a time of ~2-3 seconds. However, it reports a time in milliseconds, e.g.:
[execution time]: 0.0116159915924072
The tick.tock() call prints the message to the console before the image is displayed. This confuses me, as the main thread appears blocked until after the image is loaded.
This leads me to ask the following questions:
if the image is being loaded asynchronously in the background, then why is user interaction/the main thread blocked?
if the image is being loaded on the main thread, why does the tick.tock() function print to the console before the image is displayed?
There are 2 parts to what you are measuring here:
Loading the image from disk:
UIImage(contentsOfFile: imageFile)
And decompressing the image from a JPEG to a bitmap to be displayed:
imageView.image = ....
The first part involves actually retrieving the compressed JPEG data from the disk (disk I/O) and creating a UIImage object. The UIImage object holds a reference to the compressed data, until it needs to be displayed. Only at the moment that it's ready to be rendered to the screen does it decompress the image into a bitmap to display (on the main thread).
My guess is that your timer is only catching the disk-load part, and the decompression is happening on the next runloop. The decompression of an image that size is likely to take a while, probably the lion's share of the time.
If you want to explicitly measure how long the decompression takes, you'll need to do it manually, by drawing the image into an off-screen context, like so:
let tick = Tick()
// Load the image from disk (force-unwrapping the optional for brevity)
let image = UIImage(contentsOfFile: imageFile)!
// Decompress the image into a bitmap by drawing it off screen
UIGraphicsBeginImageContextWithOptions(image.size, true, 0)
image.drawInRect(CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
tick.tock()
Here we are replicating the decompression that would happen when you assigned the image to imageView.image.
A handy trick to keep the UI responsive when dealing with images this size is to kick the whole process onto a background thread. This works well because once you have manually decompressed the image, UIKit detects this and doesn't repeat the process.
// Switch to a background thread
dispatch_async(dispatch_get_global_queue(Int(DISPATCH_QUEUE_PRIORITY_DEFAULT.value), 0)) {
    // Load the image from disk (force-unwrapping the optional for brevity)
    let image = UIImage(contentsOfFile: imageFile)!
    // Decompress the image into a bitmap by drawing it off screen
    UIGraphicsBeginImageContextWithOptions(image.size, true, 0)
    image.drawInRect(CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Switch back to the main thread
    dispatch_async(dispatch_get_main_queue()) {
        // Display the already-decompressed image
        imageView.image = newImage
    }
}
A disclaimer: The code here has not been fully tested in Xcode, but it's 99% correct if you decide to use it.
I would try to time this using a unit test, since the XCTest framework provides some good performance-measurement tools. I think this approach would get around the lazy-loading issue... although I'm not 100% sure.
func testImagePerformance() {
    measureBlock() {
        if let imageFile = NSBundle.mainBundle().pathForResource("large_photo", ofType: "jpg") {
            imageView.image = UIImage(contentsOfFile: imageFile)
        }
    }
}
(Just an aside: you mentioned that the loading blocks the main app thread. You should look into using an NSOperationQueue to make sure that doesn't happen... you probably already know that: http://nshipster.com/nsoperation/)
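As a sketch of that aside, in the same-era Swift as the question (imageFile and imageView as in your code):
let queue = NSOperationQueue()
queue.addOperationWithBlock {
    // Load (and ideally decompress) the image off the main thread
    let image = UIImage(contentsOfFile: imageFile)
    NSOperationQueue.mainQueue().addOperationWithBlock {
        // UI updates always go back to the main queue
        self.imageView.image = image
    }
}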