How to cache images only on disk using Kingfisher? - ios

I am using Kingfisher library for downloading and caching images. I am facing some issues in the implementation:
Are the images cached both in memory and on disk?
Is there any provision to cache images only on disk?
I have already read multiple posts regarding this, but couldn't find any solution.

Yes, Kingfisher caches images both in memory and on disk.
By default, the amount of RAM that will be used is not even limited; you have to set the value yourself:
ImageCache.default.maxMemoryCost = 1024 * 1024 * yourValue
where 1024 * 1024 * yourValue is the global cost in megapixels (I know this is weird, but it's not megabytes, it's megapixels, because images can have different bit depths, etc.).
For example, in my tests, the maximum RAM used with a value of 1024 * 1024 * 500 fluctuates between 120MB and 300MB.
Incidentally, this is also how you tell Kingfisher to never use the RAM and only cache to disk:
ImageCache.default.maxMemoryCost = 1
This will force Kingfisher to only use the disk cache.
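If you only need this for particular requests rather than globally, newer Kingfisher releases also expose a per-request option. A minimal sketch, assuming Kingfisher 5's memoryCacheExpiration option (the helper function name is just for illustration):

import Kingfisher
import UIKit

// Memory entries expire immediately, so in practice only the disk cache
// serves repeated requests for this image.
func setDiskOnlyImage(on imageView: UIImageView, from url: URL) {
    imageView.kf.setImage(
        with: url,
        options: [.memoryCacheExpiration(.expired)]
    )
}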
How to debug
The first thing is to check that you're setting the max value on the right cache. Did you perhaps create a custom cache? My example sets the value for the default cache, the one used if no other is defined.
You may also want to manually clear the memory cache and compare the RAM occupation before and after:
ImageCache.default.clearMemoryCache()
If you think that some big image is in the memory cache when it shouldn't be, you can verify with isImageCached:
if let result = ImageCache.default.isImageCached(forKey: imageLink) {
    print(result.cached)
    print(result.cacheType)
}

If anyone is looking for an answer for downloading images explicitly and caching them without using an image view, here is sample code:
ImageDownloader.default.downloadImage(with: imgUrl, retrieveImageTask: nil, options: [], progressBlock: nil) { (image, error, url, data) in
    print("Downloaded Image: \(url)")
    // Cache the image:
    if let image = image, let url = url {
        ImageCache.default.store(image, forKey: url.absoluteString)
    }
}
reference: https://github.com/onevcat/Kingfisher/wiki/Cheat-Sheet
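The completion signature above is from an older Kingfisher release; with Kingfisher 5's Result-based API the same idea would look roughly like this (a sketch, assuming ImageDownloader.default.downloadImage(with:completionHandler:) and ImageCache.default.store(_:forKey:)):

import Kingfisher

// Download explicitly, then store the image in the default cache keyed by its URL.
func downloadAndCache(_ imgUrl: URL) {
    ImageDownloader.default.downloadImage(with: imgUrl) { result in
        switch result {
        case .success(let value):
            ImageCache.default.store(value.image, forKey: imgUrl.absoluteString)
            print("Downloaded image from \(imgUrl)")
        case .failure(let error):
            print("Download failed: \(error)")
        }
    }
}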

Swift 5.3, Xcode 12
Updating https://stackoverflow.com/a/44354411/10579134 for the latest version:
ImageCache.default.memoryStorage.config.totalCostLimit = 1 // the limit is in bytes

Related

WebP encodedData loads for 30+ seconds

iOS version: 13.1
iPhone: X
I'm currently using DBAttachmentPickerController to choose from a variety of images; the problem comes when I take a picture directly from the camera and try to upload it to our server. The SDImageWebPCoder.shared.encodedData call takes about 30 seconds, more or less. The same image in the Android app takes about 2-3 seconds.
Here is the code I use
let attachmentPickerController = DBAttachmentPickerController(finishPicking: { attachmentArray in
    self.images = attachmentArray
    var currrentImage = UIImage()
    self.images[0].loadOriginalImage(completion: { image in
        self.userImage.image = image
        currrentImage = image!
    })
    // We transform it to WebP
    let webpData = SDImageWebPCoder.shared.encodedData(with: currrentImage, format: .webP, options: nil)
    self.api.editImageUser(data: webpData!)
}, cancel: nil)
attachmentPickerController.mediaType = DBAttachmentMediaType.image
attachmentPickerController.allowsSelectionFromOtherApps = true
attachmentPickerController.present(on: self)
Should I change the Pod I'm using? Should I just compress it? Or am I doing something wrong?
WebP encoding is relatively slow: it uses software encoding and the VP8 compression algorithm (which is complicated), compared to the hardware-accelerated JPEG/PNG encoding on Apple's SoCs.
picture directly from the camera
The original image taken with the iPhone camera may be really large, like 4K resolution. If you don't pre-scale it and just try to encode it, it may consume much more time.
My suggestions are as follows (a combined code sketch comes after the list):
Try to use options like compressionQuality; a higher value costs more time but compresses more. By default it is 1.0, which is the highest and the most time-consuming.
Try to pre-scale the original image. For images from the Photos Library, you can always use the API to control the size. Or you can use SDWebImage's transform method, like -[UIImage sd_resizedImage:].
Do all the encoding on a background thread; never block the main thread.
If none of these are suitable, the better solution is to use the JPEG or PNG format instead of WebP, and then transcode the JPEG/PNG to WebP in your image server's code. Server-side processing is always the best idea for this kind of thing.
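Putting the first three suggestions together, a rough sketch (assuming SDImageWebPCoder's encodedData(with:format:options:) and the encodeCompressionQuality coder option; the target dimension and quality are arbitrary example values):

import SDWebImageWebPCoder
import UIKit

func encodeWebPInBackground(_ original: UIImage, maxDimension: CGFloat = 1024,
                            completion: @escaping (Data?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // 1. Pre-scale the camera image so the encoder has far fewer pixels to process.
        let scale = min(1, maxDimension / max(original.size.width, original.size.height))
        let targetSize = CGSize(width: original.size.width * scale,
                                height: original.size.height * scale)
        let scaled = UIGraphicsImageRenderer(size: targetSize).image { _ in
            original.draw(in: CGRect(origin: .zero, size: targetSize))
        }

        // 2. Encode with a quality below the default 1.0.
        let webpData = SDImageWebPCoder.shared.encodedData(
            with: scaled,
            format: .webP,
            options: [.encodeCompressionQuality: 0.8]
        )

        // 3. Hand the result back on the main thread before touching UI or the upload call.
        DispatchQueue.main.async { completion(webpData) }
    }
}

You would then call your existing upload (api.editImageUser(data:)) from the completion closure.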
If you're interested in a real benchmark comparing JPEG/PNG (hardware) and WebP (software), you can try my benchmark demo here to help you make your decision.
https://github.com/dreampiggy/ModernImageFormatBenchmark

Storage downloading costs too much for Firestore app in Swift 4

So the way my app works is kind of like Instagram. A user can upload a photo, and whenever someone loads the app it downloads each picture that was uploaded from Firebase.
I understand that I need to buy space or change my plan, but I haven't done that much and I'm already using 1.7GB from a single user in about an hour. Each photo costs like 17MB to upload and download.
I am not sure what I can do to lessen my downloading here.
The way I download from Firebase Storage is like this:
// Create a reference to the file you want to download
let islandRef = storageRef.child("images/island.jpg")

// Download in memory with a maximum allowed size of roughly 100MB (1 * 10240 * 10240 bytes)
islandRef.getData(maxSize: 1 * 10240 * 10240) { data, error in
    if let error = error {
        // Uh-oh, an error occurred!
    } else {
        // Data for "images/island.jpg" is returned
        let image = UIImage(data: data!)
    }
}
And each time it loads a photo into a collection view controller, which means it's like 17MB for each photo, which is a lot. Any suggestions? Thanks
So this is where you want to make a decision about the level of quality for the photos that you upload to Firebase. I can assure you that Instagram and every other social media platform only store versions of your pictures that are compressed and optimized for size.
You can easily compress your image by doing something like this:
let data = imageToUpload.jpegData(compressionQuality: 0.3)
You would then upload that new, compressed version of the image to Firebase and dramatically improve your storage efficiency.
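A sketch that combines downscaling and compression before upload (the target width, quality, and path are example values; putData(_:metadata:completion:) is the standard Firebase Storage upload call):

import FirebaseStorage
import UIKit

func uploadCompressed(_ imageToUpload: UIImage, to storageRef: StorageReference) {
    // Downscale first so a full-resolution camera shot doesn't become a 17MB upload.
    let maxDimension: CGFloat = 1080
    let scale = min(1, maxDimension / max(imageToUpload.size.width, imageToUpload.size.height))
    let targetSize = CGSize(width: imageToUpload.size.width * scale,
                            height: imageToUpload.size.height * scale)
    let resized = UIGraphicsImageRenderer(size: targetSize).image { _ in
        imageToUpload.draw(in: CGRect(origin: .zero, size: targetSize))
    }

    // Then compress; 0.3 typically brings a photo down to a few hundred KB.
    guard let data = resized.jpegData(compressionQuality: 0.3) else { return }

    let imageRef = storageRef.child("images/island.jpg") // example path from the question
    imageRef.putData(data, metadata: nil) { _, error in
        if let error = error {
            print("Upload failed: \(error)")
        } else {
            print("Uploaded \(data.count) bytes")
        }
    }
}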

AlamofireImage cache?

I use AlamofireImage in conjunction with PromiseKit and Alamofire. I use promises to chain the download of x number of images, since I want to do this sequentially. I can't figure out whether ImageDownloader() caches automatically or whether I have to add the image to the cache explicitly. The only examples I've seen so far don't use the ImageDownloader, so I have a really hard time finding an answer to this.
If not - how do I add it the cache? I've tried using:
self.imageDownloader.imageCache?.addImage(image, withIdentifier: imageUrl)
But all it does is increase my memory usage for all eternity (i.e. it adds the same image to the cache over and over).
I think that defining an AutoPurgingImageCache() and then using it during the caching process should solve your memory usage problem.
let photoCache = AutoPurgingImageCache(
    memoryCapacity: 100 * 1024 * 1024,
    preferredMemoryUsageAfterPurge: 60 * 1024 * 1024
)
You can change memoryCapacity and preferredMemoryUsageAfterPurge; the values above are in bytes (100 MB and 60 MB). To add an image to your cache, you can use something like this:
photoCache.addImage(scaledImage, withIdentifier: urlString)
You can also check this topic for more details, as well as the AlamofireImage GitHub page.
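If you want the ImageDownloader itself to go through that cache, and you want to avoid adding the same image repeatedly, a sketch along these lines should work (assuming a recent AlamofireImage where the cache method is add(_:withIdentifier:); older versions spell it addImage(_:withIdentifier:)):

import AlamofireImage
import UIKit

// The same cache as above.
let photoCache = AutoPurgingImageCache(
    memoryCapacity: 100 * 1024 * 1024,                // 100 MB ceiling
    preferredMemoryUsageAfterPurge: 60 * 1024 * 1024  // purge down to 60 MB
)

// Hand the cache to the downloader so downloaded images are cached automatically.
let imageDownloader = ImageDownloader(imageCache: photoCache)

// Only add an image manually if it isn't already cached,
// so memory doesn't keep growing when the same URL is processed again.
func cacheIfNeeded(_ image: UIImage, for urlString: String) {
    if photoCache.image(withIdentifier: urlString) == nil {
        photoCache.add(image, withIdentifier: urlString)
    }
}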
I was also using the same approach you did here. But according to the API reference, the image should be added to the cache automatically:
Each downloaded image is cached in the underlying NSURLCache as well as the in-memory image cache that supports image filters.
https://alamofire.github.io/AlamofireImage/Classes/ImageDownloader.html

Gif using UIImage+animatedGif class

Using this class, I am trying to load a GIF URL into a UIImageView.
The thing is, for some URLs it takes 10 seconds to load, for others 2 seconds.
I have tried almost everything, but the process is still too slow. 1 second would be good, but I have never succeeded in getting there.
I have also tried UIWebView, which had its own issues.
Here is the code:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    let fileUrl = NSURL(string: "http://45.media.tumblr.com/6785bae27b8f888fe825f0ade95796a3/tumblr_noenkbeTSw1qjmwryo1_500.gif")
    let gif = UIImage.animatedImageWithAnimatedGIFURL(fileUrl!)
    dispatch_async(dispatch_get_main_queue()) {
        self.player.image = gif
    }
}
The problem with most of the GIF reading tools I have looked at is that they read all the data at load time, allocate memory for every decoded frame, and hold all that uncompressed data in memory at the same time. This leads to runtime performance problems, and it will crash your app (and possibly your device) on large or long GIFs. On loading time there is not much you can do, since the data does need to be downloaded and read. You are also just assuming that the network cache is going to handle hitting the same GIF over and over without going to the network again, which may or may not work well for you. For a solution that addresses these issues, see this SO question, or take a look at the Flipboard solution here.
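If you stay with the UIImage+animatedGif category, one incremental improvement along those lines is to download the GIF data yourself, so you control caching, and to decode off the main thread. A rough sketch, assuming the category also exposes a data-based variant such as animatedImage(withAnimatedGIFData:):

import UIKit

let gifCache = NSCache<NSURL, UIImage>()

func loadGIF(from url: URL, into imageView: UIImageView) {
    // Serve repeated requests from our own cache instead of relying on NSURLCache.
    if let cached = gifCache.object(forKey: url as NSURL) {
        imageView.image = cached
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data else { return }
        // Decode off the main thread (the assumed data-based variant of the GIF category).
        let gif: UIImage? = UIImage.animatedImage(withAnimatedGIFData: data)
        DispatchQueue.main.async {
            if let gif = gif {
                gifCache.setObject(gif, forKey: url as NSURL)
                imageView.image = gif
            }
        }
    }.resume()
}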

Determine memory limit of iOS today extension

I'm developing an iOS today extension that can read an image from UIPasteboard and save it on disk. This process fails with large images because iOS extensions can't use much memory. To work around this issue, I check the size of the image first and try to decide whether the widget can save it or should delegate this task to its host app:
let MAXIMUM_IMAGE_SIZE_BYTES = <SomeMagicNumber>

if let clipboardImage = UIPasteboard.generalPasteboard().image {
    let imageSize = CGImageGetHeight(clipboardImage.CGImage) * CGImageGetBytesPerRow(clipboardImage.CGImage)
    if imageSize > MAXIMUM_IMAGE_SIZE_BYTES {
        // Open host app to save image
    } else {
        // Save image directly
    }
}
I have the following questions:
Is my size calculation correct? I took it from this thread. I cannot instantiate a JPEG or PNG representation and read its size because of the memory limitations mentioned above.
Can I get rid of that magic number for the maximum image size in bytes? If not, are there any official specifications from Apple that I can use? I cannot test my app on every available iOS model and don't want to risk crashes on older devices.
Thanks a lot for your help!
I'm just starting to look at the memory that a notification service extension is using. I found this presentation. Might be helpful for others.
https://cocoaheads.tv/memory-use-in-extensions-by-conrad-kramer/
What was your solution to this issue?
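For reference, the size estimate from the question in current Swift would look roughly like this (a sketch; the byte threshold is still a placeholder you have to pick yourself, not an official Apple limit):

import UIKit

let maximumImageSizeBytes = 10 * 1024 * 1024 // placeholder threshold, not an official limit

if let clipboardImage = UIPasteboard.general.image, let cgImage = clipboardImage.cgImage {
    // Approximate decoded size in memory: bytes per row * number of rows.
    let imageSizeInBytes = cgImage.bytesPerRow * cgImage.height
    if imageSizeInBytes > maximumImageSizeBytes {
        print("Too large for the widget, hand off to the host app")
    } else {
        print("Small enough to save directly in the extension")
    }
}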
