Memory issue decodeObjectForKey (Swift project) - ios

In a Swift application I am exporting an image of each individual view controller (using drawViewHierarchyInRect).
For each view controller I restore the contents of its image views as follows:
self.imageView = (aDecoder.decodeObjectForKey("imageLevel") as? UIImageView)
self.imageView!.contentMode = UIViewContentMode.ScaleAspectFit
self.imageView!.multipleTouchEnabled = true
self.imageView!.autoresizesSubviews = true
self.addSubview(imageView!)
After about 15 "pages" the app crashes. If I comment out these lines, the code works again. Is there an alternative to decodeObjectForKey?

I'm not really sure what you're doing here, but you shouldn't be trying to encode a UIImageView and save it off, since the view is tied to other objects in the view hierarchy. You could encode the UIImage instead. Your UIImageView should come from your storyboard/xib, and then you set the decoded image on that UIImageView.
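A minimal sketch of that approach (Swift 2-era API to match the question; the class name is hypothetical, the key is the one used in the question):

class PageView: UIView {
    var imageView: UIImageView?

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        // Decode only the UIImage, then rebuild the image view in code.
        let image = aDecoder.decodeObjectForKey("imageLevel") as? UIImage
        let iv = UIImageView(image: image)
        iv.contentMode = .ScaleAspectFit
        iv.multipleTouchEnabled = true
        iv.autoresizesSubviews = true
        addSubview(iv)
        imageView = iv
    }

    override func encodeWithCoder(aCoder: NSCoder) {
        super.encodeWithCoder(aCoder)
        // Archive only the image, never the UIImageView itself.
        if let image = imageView?.image {
            aCoder.encodeObject(image, forKey: "imageLevel")
        }
    }
}

Archiving only the image keeps the archive small and avoids dragging the view's superview and gesture state into the encoder, which may be what was building up memory across pages.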

Related

UIImageView image does not update visibly when image property is set

I have a UIImageView whose user interaction is true and to which I have given a tap gesture recognizer, whose action handler is as follows:
@IBAction func tap(_ sender: UITapGestureRecognizer) {
    let iv = sender.view as! UIImageView
    let im = iv.image!
    let im2 = UIGraphicsImageRenderer(size: im.size).image { _ in
        UIColor.red.setFill()
        UIBezierPath(rect: CGRect(origin: .zero, size: im.size)).fill()
    }
    iv.image = im2
}
I expect the image displayed, when I tap the image view, to be replaced by a solid red image. This works fine on my High Sierra machine running Xcode 9.4. But on my Sierra MacBook running Xcode 9.2, nothing visibly happens.
It's weird. By pausing in the debugger, I can see that the new image is being constructed correctly.
The image is being replaced, but the image view isn't being redrawn. Adding calls like setNeedsDisplay does nothing.
Moreover, if I then proceed to replace the image view's image with a different image, I see the red image!
iv.image = im2
delay(0.5) {
    iv.image = im // causes im2 to appear!
}
Some sort of behind-the-scenes caching is evidently causing the image view to get behind in its display by one image.
Can anyone shed light on this? It's presumably a bug in iOS itself, and perhaps in 9.2 specifically; how would one work around it? (Obviously one could substitute another image view wholesale, but that wouldn't tell us what's going on with the caching.)
This seems to be a workaround:
iv.image = im2
delay(0.05) {
    iv.image = nil
    iv.image = im2
}
But what a horror... Omitting any of those assignments, or reducing the delay to zero (e.g. by calling DispatchQueue.main.async instead), causes the workaround to fail.
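For reference, delay here is not a UIKit API; presumably it is the usual wrapper around DispatchQueue.main.asyncAfter, something like:

func delay(_ seconds: Double, closure: @escaping () -> ()) {
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds, execute: closure)
}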
Encountered this problem in Xcode 13. Set the contentMode to center in the xib file
or
iv.contentMode = .center

Unable to add UIImageView Xcode Swift

For some reason I am unable to add a UIImageView to my app. This is the code I am using; I have searched for quite a while to figure this out but haven't had any luck.
super.viewDidLoad()
let cloudimage = UIImage(named: "cloud")
let cloudView = UIImageView(image: cloudimage)
self.view.addSubview(cloudView)
cloudView.frame = CGRectMake(0,0,100,200)
The image is a .png in my assets folder, so I don't think that's the problem. I do have Auto Layout enabled, if that makes a difference. I know it can be an issue when moving a UIImageView around by setting its frame, but I think I should still be able to place the image in the view with this code.
I am not quite sure what to do; any suggestions would be great. This is extremely frustrating.
If your image is nil, your UIImageView will not render anything. Try debugging your view hierarchy.
https://developer.apple.com/library/tvos/documentation/DeveloperTools/Conceptual/debugging_with_xcode/chapters/special_debugging_workflows.html
http://www.raywenderlich.com/98356/view-debugging-in-xcode-6
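One quick way to rule out a nil image (a sketch, assuming the code runs in viewDidLoad as in the question):

guard let cloudimage = UIImage(named: "cloud") else {
    print("asset 'cloud' not found in the bundle or target")
    return
}
let cloudView = UIImageView(image: cloudimage)
cloudView.frame = CGRect(x: 0, y: 0, width: 100, height: 200)
self.view.addSubview(cloudView)

If the guard fires, the problem is the asset name or target membership, not the view code.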
You need to add the UIImageView to the main view:
self.view.addSubview(cloudView)
Try including the image's file extension as well when creating the UIImage:
let cloudimage = UIImage(named: "cloud.png")
let cloudView = UIImageView(image: cloudimage)
cloudView.frame = CGRectMake(0,0,100,200)
self.view.addSubview(cloudView)
Checked and it's working.

Get height of a hidden, dynamically populated UIView

I am trying to get a "screenshot" of a specific UIView that gets loaded from a xib that is never to be seen by the user.
Here's the code:
let nibViews = NSBundle.mainBundle().loadNibNamed("CardView", owner: ownedBy, options: nil)
if let cardView = nibViews.first as? CardView {
    UIGraphicsBeginImageContextWithOptions(cardView.bounds.size, false, UIScreen.mainScreen().scale)
    cardView.drawViewHierarchyInRect(cardView.bounds, afterScreenUpdates: true)
    let imageAgain = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // do something with imageAgain
}
The issue I'm having is that the cardView bounds, when I pass them into UIGraphicsBeginImageContextWithOptions, are not correct. Specifically, the height is not correct. If I println(cardView.bounds.size.height) before UIGraphicsBeginImageContextWithOptions, I get a generic "600" for the height... which distorts the image that gets created.
I did find a "solution", but not something I'm feeling comfortable with - if I call the UIGraphicsBeginImageContextWithOptions code block TWICE, on the second pass, the cardView.bounds.size.height returns the CORRECT height based on the content added to the cardView (also taking into account the constraints on the xib)
What seems to be happening is, the cardView's true layout isn't fully realized until drawViewHierarchyInRect is called once.
My question is - what is drawViewHierarchyInRect doing to make cardView report the correct height? And can I somehow do that thing through some separate function without creating 2 screenshots of the UIView?
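One thing worth trying (a sketch, not from the original thread): force a layout pass on the nib-loaded view before snapshotting, so its bounds reflect the constraints and content rather than the xib's design-time 600-point size:

cardView.setNeedsLayout()
cardView.layoutIfNeeded()
UIGraphicsBeginImageContextWithOptions(cardView.bounds.size, false, UIScreen.mainScreen().scale)
cardView.drawViewHierarchyInRect(cardView.bounds, afterScreenUpdates: true)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

drawViewHierarchyInRect with afterScreenUpdates: true also forces pending layout and display before rendering, which is presumably why the bounds come out correct on the second pass.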

Generating the image on the screen in IOS Simulator

What I currently wish to accomplish is for an image to be displayed as the background of the screen in the iOS Simulator, so inside my viewDidLoad I have this:
var img = UIImage(named: "paper.jpg")
This creates the image variable, but I haven't found how to display it on the screen yet. It may seem like a trivial problem, but I haven't found any documentation on this after searching for a while. Thanks for reading.
Refer to the UIColor documentation.
In Swift, you have to call a convenience initializer. This is because in Swift, all Objective-C class methods which return an instance of their class become convenience initializers.
Here's how it looks in Swift:
self.view.backgroundColor = UIColor(patternImage: UIImage(named: "paper.jpg"))
+ (UIColor *)colorWithPatternImage:(UIImage *)image returns a UIColor instance, so in Swift it becomes the convenience initializer init(patternImage image: UIImage!). Similarly, UIImage's imageNamed: becomes init(named:).
Since this is the marked answer, I felt the need to add a bit more code for completeness.
As @senior posted in his answer, another way to add an image to your background is to add a UIImageView as a subview, like so:
let img = UIImage(named: "paper.jpg")
let imgView = UIImageView(image: img)
self.view.addSubview(imgView)
You have to add your image to a UIImageView and add that to the current view as a subview:
let img = UIImage(named: "paper.jpg")
let imgView = UIImageView(image: img)
self.view.addSubview(imgView)
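Note that neither snippet sizes the image view to the screen, so if the image should cover the whole background you also need to give it a frame (or constraints). A sketch, not part of the original answers:

let imgView = UIImageView(image: UIImage(named: "paper.jpg"))
imgView.frame = self.view.bounds
imgView.contentMode = .ScaleAspectFill
imgView.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]
self.view.insertSubview(imgView, atIndex: 0) // keep it behind other content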

Flip UIImageViews for Right to Left Languages

iOS automatically flips the entire ViewController when using a RTL language like Arabic and does a great job with most of the layout, especially text. The default behavior is to flip the layout but leave UIImageViews the same orientation (since you generally don't want to reverse images).
Is there a way to specify that some images should be flipped (such as arrows) when the phone is set to a RTL language?
iOS 9 includes the imageFlippedForRightToLeftLayoutDirection method, which automatically flips the image in a UIImageView when running under an RTL localization.
The best solution I found to date is marking the image as mirrored in the asset catalog.
We can use imageFlippedForRightToLeftLayoutDirection, which returns a flipped image if the current language is RTL (right to left):
Objective-c
UIImage *flippedImage = [[UIImage imageNamed:@"imageName"] imageFlippedForRightToLeftLayoutDirection];
Swift 3
let flippedImage = UIImage(named: "imageName")?.imageFlippedForRightToLeftLayoutDirection()
Source: Apple Docs
You have to manually flip the UIImages in the UIImageViews you want when the phone is set to a RTL language. This can be easily achieved with this code:
UIImage *defaultImage = [UIImage imageNamed:@"default.png"];
UIImage *flipImage = [UIImage imageWithCGImage:defaultImage.CGImage scale:1.0 orientation:UIImageOrientationUpMirrored];
myImageview.image = flipImage;
I ended up using localized images for the forward and back arrows. This had the advantage of not having to add code each place the image was used and gives the opportunity to clean up the arrows if there are gradients that don't work well flipped.
While we wait for iOS 9's improved right-to-left support, you could create a UIImageView subclass and override the image setter to mirror the image as @nikos-m suggests, assigning the flipped image via super.image.
That way you can easily mark all the image views you want to flip using custom classes in Interface Builder instead of having to add IBOutlets.
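A minimal sketch of such a subclass (written in current Swift; the class name is hypothetical, and the RTL check uses UIApplication's userInterfaceLayoutDirection):

class MirroringImageView: UIImageView {
    override var image: UIImage? {
        get { return super.image }
        set {
            // Mirror only when the app is running in a right-to-left layout.
            let isRTL = UIApplication.shared.userInterfaceLayoutDirection == .rightToLeft
            if isRTL, let newImage = newValue, let cg = newImage.cgImage {
                super.image = UIImage(cgImage: cg, scale: newImage.scale, orientation: .upMirrored)
            } else {
                super.image = newValue
            }
        }
    }
}

Assign the class in Interface Builder and every image set on those views is mirrored automatically under RTL layouts.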
Swift 5
If you want to individualize the image flip, you can register each image with the direction you want since the layout direction is a trait:
let leftToRight = UITraitCollection(layoutDirection: .leftToRight)
let rightToLeft = UITraitCollection(layoutDirection: .rightToLeft)
let imageAsset = UIImageAsset()
let leftToRightImage = UIImage(named: "leftToRightImage")!
let rightToLeftImage = UIImage(named: "rightToLeftImage")!
imageAsset.register(leftToRightImage, with: leftToRight)
imageAsset.register(rightToLeftImage, with: rightToLeft)
This is the same as configuring it in the asset catalogue as @SergioM answered.
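Once registered, the variant for the current direction can be resolved from the asset explicitly, for example (imageView here is any UIImageView of yours):

imageView.image = imageAsset.image(with: imageView.traitCollection)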
