For some reason I am unable to add a UIImageView to my app. This is the code I am using; I have searched for quite a while to figure it out but haven't had any luck.
override func viewDidLoad() {
    super.viewDidLoad()
    let cloudImage = UIImage(named: "cloud")
    let cloudView = UIImageView(image: cloudImage)
    cloudView.frame = CGRect(x: 0, y: 0, width: 100, height: 200)
    self.view.addSubview(cloudView)
}
The image is a .png in my asset catalog, so I don't think that's the problem. I do have Auto Layout enabled, if that could be an issue? I know Auto Layout can be an issue when moving a UIImageView around by setting its frame, but I would still expect this code to place the image in the view.
I am not quite sure what to do; any suggestions would be great, as this is extremely frustrating.
If your image is nil, your UIImageView will not render anything. Try debugging your view hierarchy.
https://developer.apple.com/library/tvos/documentation/DeveloperTools/Conceptual/debugging_with_xcode/chapters/special_debugging_workflows.html
http://www.raywenderlich.com/98356/view-debugging-in-xcode-6
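For example, a quick nil check at load time will tell you whether the asset lookup is failing (a minimal sketch, using the asset name from the question):

guard let cloudImage = UIImage(named: "cloud") else {
    print("Asset \"cloud\" was not found in the asset catalog")
    return
}
let cloudView = UIImageView(image: cloudImage)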
You need to add the UIImageView to the main view:
self.view.addSubview(cloudView)
Try including the image's file extension when creating the UIImage:
let cloudImage = UIImage(named: "cloud.png")
let cloudView = UIImageView(image: cloudImage)
cloudView.frame = CGRect(x: 0, y: 0, width: 100, height: 200)
self.view.addSubview(cloudView)
I checked, and it's working.
I have a UIImageView whose isUserInteractionEnabled is true, and to which I have attached a tap gesture recognizer whose action handler is as follows:
@IBAction func tap(_ sender: UITapGestureRecognizer) {
    let iv = sender.view as! UIImageView
    let im = iv.image!
    // Draw a solid red image of the same size
    let im2 = UIGraphicsImageRenderer(size: im.size).image { _ in
        UIColor.red.setFill()
        UIBezierPath(rect: CGRect(origin: .zero, size: im.size)).fill()
    }
    iv.image = im2
}
I expect the image displayed, when I tap the image view, to be replaced by a solid red image. This works fine on my High Sierra machine running Xcode 9.4. But on my Sierra MacBook running Xcode 9.2, nothing visibly happens.
It's weird: by pausing in the debugger, I can see that the new image is being constructed correctly.
The image is being replaced, but the image view isn't being redrawn. Adding calls like setNeedsDisplay does nothing.
Moreover, if I then proceed to replace the image view's image with a different image, I see the red image!
iv.image = im2
delay(0.5) {
    iv.image = im // causes im2 to appear!
}
Some sort of behind-the-scenes caching is evidently causing the image view to get behind in its display by one image.
Can anyone shed light on this? It's presumably a bug in iOS itself, and perhaps in 9.2 specifically; how would one work around it? (Obviously one could substitute another image view wholesale, but that wouldn't tell us what's going on with the caching.)
This seems to be a workaround:
iv.image = im2
delay(0.05) {
    iv.image = nil
    iv.image = im2
}
But what a horror... Omitting any of those assignments, or reducing the delay to zero (e.g. by calling DispatchQueue.main.async instead), causes the workaround to fail.
I encountered this problem in Xcode 13. Set the contentMode to center in the xib file, or in code:
iv.contentMode = .center
Recently I encountered this code:
self.view.layer.contents = (id)[UIImage imageNamed:@"img_bg.png"].CGImage;
This sets up a background image for the whole view of a UIViewController.
Usually, I would just add a UIImageView as a subview, taking up the whole area of the view controller's view.
Why would someone use one technique rather than the other?
I find this one a bit disturbing because the background does not show in the storyboard, where I would expect it.
That's a great question. Fundamentally, you are asking about the difference between a view and a layer. You can think of a view as its layer's delegate, the part that handles the user's interaction with the app or system. When there is no interaction to handle and you only need to render content, you can use the layer directly, which takes up less memory than a full view.
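For comparison, here are both approaches in Swift (a sketch, using the image from the question):

// Layer-only: lightweight, no touch handling, and (as noted) invisible in the storyboard
view.layer.contents = UIImage(named: "img_bg.png")?.cgImage

// View-based: a real subview that participates in layout and hit-testing
let background = UIImageView(image: UIImage(named: "img_bg.png"))
background.frame = view.bounds
background.autoresizingMask = [.flexibleWidth, .flexibleHeight]
view.insertSubview(background, at: 0)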
Try this:
self.view.backgroundColor = UIColor(patternImage: UIImage(named: "img_bg.png")!)
Note that a pattern color tiles the image if it is smaller than the view.
Hope this will help.
Cheers
You can try this:
UIGraphicsBeginImageContext(self.view.frame.size)
UIImage(named: "img_bg.png")?.draw(in: self.view.bounds)
if let image = UIGraphicsGetImageFromCurrentImageContext() {
    self.view.backgroundColor = UIColor(patternImage: image)
}
UIGraphicsEndImageContext()
Hope this helps,
Thanks
In a Swift application, I am exporting each individual view controller as an image (using drawViewHierarchyInRect).
For each view controller I restore the contents of the image view as follows:
self.imageView = (aDecoder.decodeObjectForKey("imageLevel") as? UIImageView)
self.imageView!.contentMode = UIViewContentMode.ScaleAspectFit
self.imageView!.multipleTouchEnabled = true
self.imageView!.autoresizesSubviews = true
self.addSubview(imageView!)
After 15 "page": crash. If I comment these lines, the code works again. Is there an alternative to decodeObjectForKey?
I'm not really sure what you're doing here, but you shouldn't be encoding a UIImageView and saving it off, because the view is tied to other objects. You could encode the UIImage instead. Your UIImageView should come from your storyboard/xib, and then you set the decoded image on it.
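A minimal sketch of that approach, assuming the image is archived as PNG data (the key "imageLevelData" is made up for illustration):

// Encode the image's data instead of the whole view
if let image = imageView.image, let data = image.pngData() {
    coder.encode(data, forKey: "imageLevelData")
}

// Decode and hand the image to an image view created in the storyboard or in code
if let data = aDecoder.decodeObject(forKey: "imageLevelData") as? Data {
    imageView.image = UIImage(data: data)
}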
I have two image views, like below. (The scroll view has an image view as a subview.)
I want to take the image of each one and merge them into one.
I tried taking a screenshot and cropping it, but with different iPhone screen sizes and resolutions it doesn't work well.
Can anyone guide me on how to do this?
Why not just add them both to one UIView?
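For instance, once both image views live in one container view, the container can be rendered into a single UIImage (a sketch; containerView stands in for whatever parent view you use):

let renderer = UIGraphicsImageRenderer(bounds: containerView.bounds)
let merged = renderer.image { context in
    containerView.layer.render(in: context.cgContext)
}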
It is also possible to add a subview to your image view and extend the frame.
// Position the second image view below the existing content
[secondImageView setFrame:CGRectMake(x, y + newImageHeight, width, height)];
// Grow the first image view to make room for the second
[self.myImageView setFrame:CGRectMake(x, y, width, height + newImageHeight)];
[self.myImageView addSubview:secondImageView];
I think your approach is right; you can also solve the scale/size problem very easily with this method.
func mergeScreenshot() {
    guard let layer = UIApplication.shared.keyWindow?.layer else { return }
    let scale = UIScreen.main.scale
    // Reconsider the size property for your screenshot
    UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale)
    layer.render(in: UIGraphicsGetCurrentContext()!)
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    if let screenshot = screenshot {
        UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
    }
}
But you may need to adjust the sizes. For example, if you're using Auto Layout you can create an IBOutlet and use its size instead of the layer.frame.size property I used.
Hope it helps.
What I currently wish to accomplish is for an image to be drawn as the background of the iOS simulator screen, so inside of my viewDidLoad I have this:
var img = UIImage(named: "paper.jpg")
This creates the image variable, but I haven't found how to display it on the screen yet. It may seem like a trivial problem, but I haven't found any documentation on it after searching online for a while. Thanks for reading.
Refer to the UIColor documentation.
In Swift, you have to call a convenience initializer. This is because in Swift, all Objective-C class methods which return an instance of their class become convenience initializers.
Here's how it looks in Swift:
self.view.backgroundColor = UIColor(patternImage: UIImage(named: "paper.jpg")!)
+ (UIColor *)colorWithPatternImage:(UIImage *)image returns a UIColor instance, so it becomes the convenience initializer init(patternImage image: UIImage!) in Swift. Similarly, UIImage's imageNamed: becomes init(named:).
Since this is the marked answer, I felt the need to add a bit more code for completeness.
As @senior has posted in his answer, another way to add an image to your background is by adding a UIImageView as a subview, like so:
let img = UIImage(named: "paper.jpg")
let imgView = UIImageView(image: img)
self.view.addSubview(imgView)
You have to add your image to a UIImageView and add that to the current view as a subview:
let img = UIImage(named: "paper.jpg")
let imgView = UIImageView(image: img)
self.view.addSubview(imgView)