Rare crashes when setting UIImageView with a UIImage backed by a CIImage - ios

First of all, I want to emphasize that this bug concerns only about 1% of the user base according to Firebase Crashlytics.
I have a xcasset catalog with many heic images.
I need to display some of those images as such (original version) and some of them blurred.
Here is the code to load and display a normal image or a blurred image.
// Original image
self.imageView.image = UIImage(named: "officeBackground")!
// Blurred image
self.imageView.image = AssetManager.shared.blurred(named: "officeBackground")
I use a manager to cache the blurred images so that I don't have to re-generate them every time I display them.
final class AssetManager {
    static let shared = AssetManager()
    private var blurredBackground = [String: UIImage]()

    func blurred(named: String) -> UIImage {
        if let cachedImage = self.blurredBackground[named] {
            return cachedImage
        }
        let blurred = UIImage(named: named)!.blurred()!
        self.blurredBackground[named] = blurred
        return blurred
    }
}
And finally the blur code
extension UIImage {
    func blurred() -> UIImage? {
        let ciimage: CIImage? = self.ciImage ?? CIImage(image: self)
        guard let input = ciimage else { return nil }
        let blurredImage = input.clampedToExtent()
            .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: 13])
            .cropped(to: input.extent)
        return UIImage(ciImage: blurredImage, scale: self.scale, orientation: .up)
    }
}
And here are the two types of crashes I get.
CoreFoundation with CFAutorelease. Crashlytics has additional info about it:
crash_info_entry_0:
*** CFAutorelease() called with NULL ***
CoreImage with recursive_render. Crashlytics also has this additional info about it:
crash_info_entry_0:
Cache Stats: count=14 size=100MB non-volatile=0B peakCount=28 peakSize=199MB peakNVSize=50MB
The only common point I found between all affected users is that they have between 30 and 150 MB of free RAM at the time of the crash (according to Firebase, if that info is even reliable?).
At this point, I am honestly clueless. It seems like a bug in Core Image / Core Foundation in how it handles CIImage in memory.
The weird thing is that because I'm using the AssetManager to cache the blurred images, I know that at the time of the crash the user already has a cached version available in RAM, and yet setting the UIImageView with the cached image crashes because of low memory (?!). Why is the system even trying to allocate memory to do this?

In my experience, using a UIImage that is created directly from a CIImage is very unreliable and buggy. The main reason is that a CIImage is not really a bitmap image, but rather a recipe containing the instructions for creating an image. It is up to the consumer of the UIImage to know that it's backed by a CIImage and render it properly. UIImageView theoretically does that, but I've seen many reports here on SO that it's somewhat unreliable. And as ohglstr correctly pointed out, caching that UIImage doesn't help much, since it still needs to be rendered every time it's used.
I recommend you use a CIContext to render the blurred images yourself and cache the result. You could for instance do that in your AssetManager:
final class AssetManager {
    static let shared = AssetManager()
    private var blurredBackground = [String: UIImage]()
    private let ciContext = CIContext()

    func blurred(named name: String) -> UIImage {
        if let cachedImage = self.blurredBackground[name] {
            return cachedImage
        }
        // blurred() returns a CIImage-backed UIImage; render it to a bitmap here
        let ciImage = UIImage(named: name)!.blurred()!.ciImage!
        let cgImage = self.ciContext.createCGImage(ciImage, from: ciImage.extent)!
        let blurred = UIImage(cgImage: cgImage)
        self.blurredBackground[name] = blurred
        return blurred
    }
}

Related

How performant is CIFilter for displaying several grayscale images in a tableview?

I am using the following to convert images to grayscale before showing them on a UITableView using UIImageView:
extension UIImage {
    var noir: UIImage? {
        let contextForGrayscale = CIContext(options: nil)
        guard let currentFilter = CIFilter(name: "CIPhotoEffectNoir") else { return nil }
        currentFilter.setValue(CIImage(image: self), forKey: kCIInputImageKey)
        if let output = currentFilter.outputImage,
           let cgImage = contextForGrayscale.createCGImage(output, from: output.extent) {
            return UIImage(cgImage: cgImage, scale: scale, orientation: imageOrientation)
        }
        return nil
    }
}
Since I am showing these images in a UITableView using UIImageView, each image is being grayscaled as the user scrolls. On my iPhone 13, the performance seems very good and I don't see any lag. However, I am curious how good its performance is on an old device. I don't have an old device, so I am unable to test it.
Is this a performant way to grayscale on the fly and display them? Is there anything I can do to make it better?
Is there a way to make my phone slower to test performance? Something like simulating an older device?
If performance / memory pressure doesn't seem to be an issue, I'd just not worry about it. If it is a problem, you could use NSCache.
I'd do the caching outside the extension, but for the sake of the code example:
extension UIImage {
    private static let noirCache = NSCache<NSString, UIImage>()

    func makeNoirImage(identifier: String) -> UIImage? {
        if let cachedImage = UIImage.noirCache.object(forKey: identifier as NSString) {
            return cachedImage
        }
        let contextForGrayscale = CIContext(options: nil)
        guard let currentFilter = CIFilter(name: "CIPhotoEffectNoir") else { return nil }
        currentFilter.setValue(CIImage(image: self), forKey: kCIInputImageKey)
        if let output = currentFilter.outputImage,
           let cgImage = contextForGrayscale.createCGImage(output, from: output.extent) {
            let noirImage = UIImage(cgImage: cgImage, scale: scale, orientation: imageOrientation)
            UIImage.noirCache.setObject(noirImage, forKey: identifier as NSString)
            return noirImage
        }
        return nil
    }
}
Also, check out this article: https://nshipster.com/image-resizing/
In addition to creating this new image, you could also create a thumbnail sized right for its image view and use the built-in caching mechanisms for that. This would save some memory and improve performance overall. But again, if it's not an issue, I'd be happier with the simpler code and no caching!
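The downsampling approach from that article can be sketched roughly like this (the helper name and the chosen options are illustrative, not copied from the article):

```swift
import ImageIO
import UIKit

// Decode a thumbnail at the target size instead of the full bitmap, which keeps
// memory proportional to the displayed size rather than the file's pixel size.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

The resulting UIImage is a plain bitmap, so it can be cached in NSCache like any other image.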
Oh, one more thing. You could use this https://developer.apple.com/documentation/uikit/uitableviewdatasourceprefetching to create the image ahead of time async before the cell is displayed, so it's ready to go by the time the table asks for the cell for the given index path. Thinking about it this is probably the simplest / nicest solution here.
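A minimal sketch of that prefetching idea (the controller name, the `imageNames` array, and the use of the caching `makeNoirImage(identifier:)` helper are assumptions for illustration):

```swift
import UIKit

extension MyTableViewController: UITableViewDataSourcePrefetching {
    func tableView(_ tableView: UITableView, prefetchRowsAt indexPaths: [IndexPath]) {
        for indexPath in indexPaths {
            let name = imageNames[indexPath.row]
            DispatchQueue.global(qos: .userInitiated).async {
                // Render (and implicitly cache) the noir version ahead of time,
                // so cellForRowAt gets a cache hit when the row scrolls in.
                _ = UIImage(named: name)?.makeNoirImage(identifier: name)
            }
        }
    }
}
```

Don't forget to set `tableView.prefetchDataSource = self` (e.g. in viewDidLoad) or the prefetch callbacks will never fire.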

How to Flush Cache to Free up Memory when using UIImage - Swift 3.0

After reading a number of answers on SO and other articles (see below) what is the best method to manage memory when you are loading multiple images into an animation array?
http://www.alexcurylo.com/2009/01/13/imagenamed-is-evil/
Here are my goals:
Create Animation Array (see code below)
After the Animation plays flush the cache and load another animation (rinse, wash, repeat, you get it :)
As Apple states
https://forums.developer.apple.com/thread/17888
Using UIImage(named: imageName) caches the images, but in my case, after playing 2-3 animations in a row, iOS terminates the app (the OS does not respond to a low-memory warning and instead terminates the app - see the end of the code below).
I don't want to cache the images; instead I'd like to either:
Flush the memory each time an animation completes and then load a new animation; or
Flush the memory when the user moves to a new scene
Here is my code:
// create the Animation Array for each animation
func createImageArray(total: Int, imagePrefix: String) -> [UIImage] {
    var imageArray: [UIImage] = []
    for imageCount in 0..<total {
        let imageName = "\(imagePrefix)-\(imageCount).png"
        let image = UIImage(named: imageName)! // here is where we need to address memory and not cache the images
        //let image = UIImage(contentsOfFile: imageName)! // maybe this?
        imageArray.append(image)
    }
    return imageArray
}
// here we set the animate function values
func animate(imageView: UIImageView, images: [UIImage]) {
    imageView.animationImages = images
    imageView.animationDuration = 1.5
    imageView.animationRepeatCount = 1
    imageView.startAnimating()
}
// Here we call the animate function
animation1 = createImageArray(total: 28, imagePrefix: "ImageSet1")
animation2 = createImageArray(total: 53, imagePrefix: "ImageSet2")
animation3 = createImageArray(total: 25, imagePrefix: "ImageSet3")
func showAnimation() {
    UIView.animate(withDuration: 1, animations: {
        animate(imageView: self.animationView, images: self.animation1)
    }, completion: { _ in
        //self.animationView.image = nil // Maybe we try the following?
        //self.animationView.removeFromSuperview()
        //self.animationView = nil
    })
}
Based on SO responses, it looks like this may be the best method to prevent the images from being cached, but it doesn't seem to work in my code:
let image = UIImage(contentsOfFile: imageName)!
I have also tried this but it doesn't seem to work either:
func applicationDidReceiveMemoryWarning(application: UIApplication) {
    NSURLCache.sharedURLCache().removeAllCachedResponses()
}
I also tried the following article (removeFromSuperview) in the completion block but I couldn't get this to work either (see my code above):
https://www.hackingwithswift.com/example-code/uikit/how-to-animate-views-using-animatewithduration
New Code:
// create the Animation Array for each animation
func createImageArray(total: Int, imagePrefix: String) -> [UIImage] {
    var imageArray: [UIImage] = []
    for imageCount in 0..<total {
        let imageName = "\(imagePrefix)-\(imageCount)"
        //let image = UIImage(named: imageName)! // replaced with your code below
        if let imagePath = Bundle.main.path(forResource: imageName, ofType: "png"),
           let image = UIImage(contentsOfFile: imagePath) {
            // Your image has been loaded
            imageArray.append(image)
        }
    }
    return imageArray
}
It's pretty simple. UIImage(named:) caches images, and UIImage(contentsOfFile:) does not.
If you don't want your images to be cached, use UIImage(contentsOfFile:) instead. If you can't get that to work then post your code and we'll help you debug it.
Be aware that UIImage(contentsOfFile:) does not look for files in your app bundle. It expects a full path to the image file. You will need to use Bundle methods to find the path to the file and then pass that path to UIImage(contentsOfFile:):
if let imagePath = Bundle.main.path(forResource: "ImageSet1", ofType: "png"),
   let image = UIImage(contentsOfFile: imagePath) {
    // Your image has been loaded
}
Your code is loading all the images for all 3 animations into arrays of images and never releasing them, so the fact that the system caches those images seems pretty irrelevant. In a low-memory condition the system should flush its cached copies of the images, but your code will still hold all those images in your arrays, so the memory won't get freed. It looks to me like it's your code that's causing the memory problem, not the system's image caching.
Your code might look like this:
func createImageArray(total: Int, imagePrefix: String) -> [UIImage] {
    var imageArray: [UIImage] = []
    for imageCount in 0..<total {
        let imageName = "\(imagePrefix)-\(imageCount)"
        if let imagePath = Bundle.main.path(forResource: imageName, ofType: "png"),
           let image = UIImage(contentsOfFile: imagePath) {
            imageArray.append(image)
        }
    }
    return imageArray
}
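For the first goal (flushing after an animation completes), a sketch, assuming `animation1` is declared as an optional property (`var animation1: [UIImage]?`) that you are willing to reload later:

```swift
import UIKit

func playAndReleaseAnimation() {
    animationView.animationImages = animation1
    animationView.animationDuration = 1.5
    animationView.animationRepeatCount = 1
    animationView.startAnimating()
    // After the animation's duration has elapsed, drop every strong reference
    // to the frames so the memory can actually be reclaimed.
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.5) { [weak self] in
        self?.animationView.animationImages = nil // the view releases its frames
        self?.animation1 = nil                    // your array releases them too
    }
}
```

As long as either the view's animationImages or your own array keeps a strong reference, none of that memory is freed, which is why clearing both matters.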

Returning an item from an array in a function

I have a function that I want to return a UIImage, but the UIImage must be in the array of UIImages I have created.
This works, but I want it to fail if the item is not part of the array.
private let screenImages: [UIImage] = [#imageLiteral(resourceName: "screen-1"), #imageLiteral(resourceName: "screen-2"), #imageLiteral(resourceName: "screen-3")]
private func getImage() -> UIImage {
    let random = Int(arc4random_uniform(UInt32(screenImages.count)))
    let image = screenImages[random]
    return image
}
So in the function's return type, am I able to tell the function what kind of image it should return instead of just UIImage?
Something like this
private let getImage() -> (UIImage in screenImages)
Shuri2060 answered my question. What I was looking for was:
if screenImages.contains(x)
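Putting that together, a sketch of the whole function (returning an optional so the call can fail when the image is not part of the array; `screenImages` as defined above):

```swift
import UIKit

private func getImage() -> UIImage? {
    let random = Int(arc4random_uniform(UInt32(screenImages.count)))
    let image = screenImages[random]
    // contains(_:) compares via NSObject equality, so this guards against
    // returning anything that is not actually part of screenImages.
    return screenImages.contains(image) ? image : nil
}
```

Since the random index is always within bounds, the check can only fail if the array were mutated elsewhere; it mainly documents the invariant the asker wanted enforced.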

UITableView is Lagging when Displaying Images Swift

I have a tableView that shows images in its cells, and I'm fetching the data in the viewDidLoad function:
func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
    let cell = tableview.dequeueReusableCellWithIdentifier("cardSelectionCell", forIndexPath: indexPath) as! MyCell
    let card = fetchedResultsController.objectAtIndexPath(indexPath) as! Card
    cell.Name?.text = card.name
    let image: NSData = card.photo as NSData
    cell.logo.image = UIImage(data: image)
    let date = NSDate()
    let formatter = NSDateFormatter()
    formatter.timeStyle = .MediumStyle
    cell.policyExpiryDate.text = formatter.stringFromDate(date)
    return cell
}
But the problem is that when I start scrolling, the tableView is very laggy.
So I tried to create a dictionary to convert the images in viewDidLoad():
var imageDictionary = [Card: UIImage]()

AllCards = context.executeFetchRequest(request, error: nil) as! [Card]
for card in AllCards {
    imageDictionary[card] = UIImage(data: card.photo as NSData)
}
and in the tableView function:
cell.logo.image = imageDictionary[card]
but it still lags.
One more question: if a card is added to Core Data, I have to fetch the array of images again, so I tried to create the array in viewDidAppear(), but the images do not appear on the first load and the tableView still lags.
First of all, it depends on the size of your image:
var image: NSData = card.photo as NSData
cell.logo.image = UIImage(data: image)
These lines can be very heavy. You can easily measure them, for example like this:
let time1 = CACurrentMediaTime()
var image: NSData = card.photo as NSData
cell.logo.image = UIImage(data: image)
print(CACurrentMediaTime() - time1)
If it takes a lot of time (for example, more than 8 milliseconds), then you should optimize it. You can do that in many ways; for example, you can store the UIImage references in a dictionary or array and hand them over for display in cellForRowAtIndexPath:.
The other way is to make your UI asynchronous: load the data on a background thread and only draw on the main thread. Right now all your work happens on the main thread.
UPD: I think the best approach in this case is to store the paths to the images, resize the UIImage in a background task, and return a preview. You can also cache both the resized and the original UIImage, with the path or URL as the key; that way, next time it only costs a cache lookup instead of running the whole resize again.
Also note that this operation:
cell.logo.image = UIImage(data: image)
creates a system cache and can cause large memory allocations, because the memory is not released immediately (as in the case of UIWebView). So you can wrap the call in your own autorelease pool like this:
autoreleasepool {
    cell.logo.image = UIImage(data: image)
}
I was having this issue too; I solved it by using dispatch_async. This way the table cell will load before the image finishes loading.
let priority = DISPATCH_QUEUE_PRIORITY_HIGH
dispatch_async(dispatch_get_global_queue(priority, 0)) {
    if let imageData = card.photo as? NSData {
        let image = UIImage(data: imageData)
        dispatch_async(dispatch_get_main_queue()) {
            cell.logo.image = image
        }
    }
}
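One caveat with that approach: by the time the background work finishes, the cell may have been reused for another row. A reuse-safe variant in modern GCD syntax (assuming the same `card`, `tableView`, `indexPath`, and `MyCell` as above) re-queries the table instead of capturing the cell:

```swift
import UIKit

DispatchQueue.global(qos: .userInitiated).async {
    guard let image = UIImage(data: card.photo as Data) else { return }
    DispatchQueue.main.async {
        // Only assign if this row is still on screen, so a reused cell
        // never receives an image belonging to another row.
        if let visibleCell = tableView.cellForRow(at: indexPath) as? MyCell {
            visibleCell.logo.image = image
        }
    }
}
```

Without this guard, fast scrolling can briefly show the wrong image in a recycled cell, which looks like flicker even when the lag itself is fixed.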

How to put image into AutoPurgingImageCache after af_setImageWithURL completed?

I want to use imageView.af_setImageWithURL(URL) together with the AutoPurgingImageCache from AlamofireImage.
How can I be informed when af_setImageWithURL has fetched the image, so I can put it into my cache? Because this is done in the background.
I'd prefer something like imageView.af_imageFromCache(URL), which sets the image from the cache if it is already in there, or otherwise downloads it asynchronously, sets the image, and saves it in the cache.
The shared ImageDownloader that the UIImageView uses already puts the image into the AutoPurgingImageCache automatically. Every time you use the imageView.af_setImageWithURL(URL) it will attempt to pull the image out of the image cache if it exists. If the image is not already in the cache, it will start the download on the ImageDownloader instance.
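If you still want to be notified when the fetch finishes, af_setImage offers an optional completion handler (AlamofireImage 3.x naming shown here; treat the exact signature as approximate for your version):

```swift
import AlamofireImage
import UIKit

imageView.af_setImage(withURL: url, completion: { response in
    if let image = response.result.value {
        // The shared ImageDownloader has already stored this image in its
        // AutoPurgingImageCache; no manual insertion is needed.
        print("fetched image of size \(image.size)")
    }
})
```

This is purely for bookkeeping or UI updates; duplicating the cache insertion yourself would just hold the image in memory twice.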
SWIFT 4.0
Set up the AutoPurgingImageCache in the AppDelegate so that it can be accessed from any class:
var imageCache: AutoPurgingImageCache?

imageCache = AutoPurgingImageCache(
    memoryCapacity: 100_000_000,
    preferredMemoryUsageAfterPurge: 60_000_000
)
Then set the image in my service class:
class func setImage(url: String, imageview: UIImageView) {
    let imageCache = appDelegate.imageCache!
    // Fetch
    if let cachedAvatar = imageCache.image(withIdentifier: url) {
        imageview.image = cachedAvatar
    } else {
        Alamofire.request(url).responseImage { response in
            debugPrint(response)
            debugPrint(response.result)
            if let image = response.result.value {
                imageview.image = image
                // Add
                imageCache.add(image, withIdentifier: url)
            }
        }
    }
}
