Merge two imageViews into one and save (iOS, Swift)

I have two imageViews like below (the scroll view has an imageView as a subview).
I want to take the image from each one and merge them into a single image.
I tried taking a screenshot and cropping it, but that doesn't work well across different iPhone screen sizes and resolutions.
Can anyone guide me on how to do this?

Why not just add them both to one UIView?
It is also possible to add a subview to your image view and extend the frame.
[secondImageView setFrame:CGRectMake(x,y + newImageHeight,width,height)];
[self.myImageView setFrame:CGRectMake(x,y,width,height + newImageHeight)];
[self.myImageView addSubview:secondImageView];
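In Swift (keeping the thread's older syntax), a rough equivalent would be the following, where x, y, width, height and newImageHeight stand in for whatever layout values you're using:
// Place the second image view below the first and grow the first view's frame to hold it.
secondImageView.frame = CGRectMake(x, y + newImageHeight, width, height)
myImageView.frame = CGRectMake(x, y, width, height + newImageHeight)
myImageView.addSubview(secondImageView)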

I think your approach is right, and you can solve the scale/size problem very easily with this method:
func mergeScreenshot() {
    // Render the key window's layer at the screen's native scale.
    let layer = UIApplication.sharedApplication().keyWindow!.layer
    let scale = UIScreen.mainScreen().scale
    // Reconsider the size you pass here for your particular screenshot.
    UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale)
    layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
}
But you should adjust the sizes again. For example, if you're using Auto Layout you can create an IBOutlet and then use its bounds instead of the layer.frame.size property I used above.
Hope it helps.
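For example, if the two image views live in a single container view (a hypothetical containerView outlet here), you could render just that view instead of the whole window layer, which keeps the result independent of the device's screen size. A minimal sketch in the same pre-Swift 3 syntax as above:
func snapshotContainer() -> UIImage {
    // Render only the container that holds both image views.
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(containerView.bounds.size, false, scale)
    containerView.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let snapshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return snapshot
}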

Related

Why would one set view.layer.contents instead of using a UIImageView?

Recently I encountered this code:
self.view.layer.contents = (id)[UIImage imageNamed:@"img_bg.png"].CGImage;
This sets up a background for the whole view of a UIViewController.
Usually I would just add a UIImageView as a subview, covering the whole area of the view controller's view.
Why would someone use one technique rather than the other?
I find this one a bit disturbing because the background does not show up in the storyboard, where I would expect it.
That's a great question. At bottom, you are asking about the difference between a view and a layer. You can think of a view as its layer's delegate, which handles the user's interaction with the app or system. When there is no interaction to handle and you only need to render content, you can use the layer directly, which takes up less memory than a full view.
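For reference, the same layer-contents approach in Swift (pre-Swift 3 syntax, to match the rest of the thread) would be roughly:
// Set the layer's backing contents directly; no extra UIImageView is created.
self.view.layer.contents = UIImage(named: "img_bg.png")?.CGImage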
Try this
self.view.backgroundColor = UIColor(patternImage: UIImage(named: "img_bg.png")!)
Hope this will help
Cheers
You can try this:
// Redraw the image at the view's current size, then use it as a pattern color.
UIGraphicsBeginImageContext(self.view.frame.size)
UIImage(named: "img_bg.png")?.drawInRect(self.view.bounds)
let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
self.view.backgroundColor = UIColor(patternImage: image)
Hope this helps,
Thanks

Scale an image in an ImageView (scale x, y) - iOS

I usually scale the image of an ImageView in Android with the scaleX and scaleY properties.
How can I do the same in iOS? I'm really new to iOS; I'm using Swift 3 and I didn't find anything analogous.
If you need to scale the image proportionally, the following code might work for you:
if let cgImage = imageView.image?.cgImage, let orientation = imageView.image?.imageOrientation {
    imageView.image = UIImage(cgImage: cgImage, scale: newScale, orientation: orientation)
}
However, if you only want to display it with a size different than that of the original image, you should control the size of the image using the size of the UIImageView that presents it. To make sure the image scales correctly with the size of the view, take a look at the UIImageView.contentMode property: https://developer.apple.com/reference/uikit/uiview/1622619-contentmode
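For instance (Swift 3, assuming imageView is your UIImageView), a common pattern is to let the image view's frame or constraints define the on-screen size and pick a contentMode that controls how the image fills it:
// Keep the aspect ratio and scale the image to fit inside the image view's bounds.
imageView.contentMode = .scaleAspectFit
// .scaleAspectFill or .scaleToFill change how the image is cropped or stretched instead.
imageView.clipsToBounds = true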

iOS: Solid Border Outside UIView

A layer's borderWidth and borderColor properties draw a border inside the view. Remykits pointed this out here.
A layer's shadow... properties cannot be used to create a border that both appears on all four sides and is opaque for reasons I showed here.
What I failed to specify in that question (and the reason I've opened a new one) is that I want the border to be outside the view. Increasing the frame of the view to compensate for the space lost, as has been suggested, doesn't work; I'm using a UIImageView, so even if the frame is increased, the image is still cropped.
Another suggestion was to change the contentMode of the UIImageView to .Center, in combination with changing the size of the view, but this doesn't work as the view then isn't the proper size.
The solution I first thought of was to create another UIView "behind" this UIImageView, and give it a backgroundColor to mimic the effect of a border. I also thought of creating a custom subclass of UIImageView. Both courses of action, however, involve making calculations based on the frame of the view. I've had many problems with the frame not being set by AutoLayout at the proper time, etc.
Other things that come to mind are digitally adding a border to the image or positioning the image in a specific part of the UIImageView. (My attempt at the latter was imageView.layer.contentsRect = CGRectInset(imageView.bounds, 4, 4), which resulted in a strangely pixellated image.)
To be clear, what I'm looking for is this:
It really feels like there should be a simpler way to do this than creating a new class or view. Any help appreciated.
Aha! Stitching together aykutt's comment about resizing the image and changing the contentMode, Paul Lynch's answer about resizing images, and rene's (life-saving) answer about what to do when your subviews aren't actually laid out yet in viewDidLayoutSubviews:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Make sure the container's subviews have their final frames before measuring.
    self.myContainer.setNeedsLayout()
    self.myContainer.layoutIfNeeded()
    let width: CGFloat = 4 // the same width used for the border of the imageView
    // Shrink the drawing area by the border width on every side.
    let rect = CGRectInset(imageView.bounds, width, width)
    let size = CGSizeMake(rect.width, rect.height)
    UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
    image.drawInRect(CGRectMake(0, 0, size.width, size.height)) // `image` is the original UIImage
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Center the smaller image so the layer's inner border no longer covers it.
    self.imageView.contentMode = .Center
    self.imageView.image = newImage
}
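For completeness, the border itself still goes on the image view's layer with that same width (a sketch, assuming a black 4-point border; swap in your own color):
// Same 4-point width used when insetting the image above.
imageView.layer.borderWidth = 4
imageView.layer.borderColor = UIColor.blackColor().CGColor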

Get an image from two imageViews in Swift

I'm developing an app where I have an area with an image in the background, and another image that I can move on top of it, like a sticker.
My goal is to create and save an image made of the background image with the "sticker" above it, using Swift. Here's my function that is meant to do what I want (my background image view is called "imageView", my sticker image view is called "jacques"):
let newSize = CGSizeMake(imageView.image!.size.width, imageView.image!.size.height)
UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0);
imageView.image!.drawInRect(CGRectMake(0,0,newSize.width,newSize.height))
let jacquesX = ((jacques.frame.origin.x - imageView.frame.origin.x) * (imageView.image?.size.width)!) / UIScreen.mainScreen().bounds.width
let jacquesY = ((jacques.frame.origin.y - imageView.frame.origin.y) * (imageView.image?.size.height)!) / UIScreen.mainScreen().bounds.height
let jacquesWidth: CGFloat = jacques.image!.size.width
let jacquesHeight: CGFloat = jacques.image!.size.height
jacques.image!.drawInRect(CGRectMake(jacquesX, jacquesY, jacquesWidth, jacquesHeight))
let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(newImage, self, "image:didFinishSavingWithError:contextInfo:", nil)
Unfortunately, my sticker isn't the right size. I think it's placed correctly in the image, but its size is way too small. I don't know if my solution is the best one; I'm open to all suggestions. And I'm new, so if you have good practices to share, I'm all ears :)
You could do this in the storyboard. First, size the picture however you want. To create a sticker effect, add an imageView sized to match the background, put jacques on the storyboard, and then resize them.
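Alternatively, sticking with code, one way to keep the sticker's proportions is to convert its on-screen frame into the background image's coordinate space using the ratio between the background image and the image view that displays it, rather than the screen bounds (this assumes the background fills the image view exactly, e.g. a scale-to-fill content mode). A rough sketch in the same pre-Swift 3 syntax as the question:
let background = imageView.image!
// How much bigger the underlying image is than the on-screen image view.
let scaleX = background.size.width / imageView.frame.width
let scaleY = background.size.height / imageView.frame.height
UIGraphicsBeginImageContextWithOptions(background.size, false, 0.0)
background.drawInRect(CGRectMake(0, 0, background.size.width, background.size.height))
// Map the sticker's frame from view coordinates into image coordinates.
let stickerRect = CGRectMake((jacques.frame.origin.x - imageView.frame.origin.x) * scaleX,
                             (jacques.frame.origin.y - imageView.frame.origin.y) * scaleY,
                             jacques.frame.width * scaleX,
                             jacques.frame.height * scaleY)
jacques.image!.drawInRect(stickerRect)
let merged = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()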

Get height of a hidden, dynamically populated UIView

I am trying to get a "screenshot" of a specific UIView that gets loaded from a xib that is never to be seen by the user.
Here's the code:
let nibViews = NSBundle.mainBundle().loadNibNamed("CardView", owner: ownedBy, options: nil)
if let cardView = nibViews.first as? CardView {
    UIGraphicsBeginImageContextWithOptions(cardView.bounds.size, false, UIScreen.mainScreen().scale)
    cardView.drawViewHierarchyInRect(cardView.bounds, afterScreenUpdates: true)
    let imageAgain = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // do something with the image
}
The issue I'm having is that the cardView bounds are not correct when I pass them into UIGraphicsBeginImageContextWithOptions. Specifically, the height is not correct. If I println(cardView.bounds.size.height) before UIGraphicsBeginImageContextWithOptions, I get a generic 600 for the height, which distorts the image that gets created.
I did find a "solution", but not something I'm comfortable with: if I call the UIGraphicsBeginImageContextWithOptions block TWICE, then on the second pass cardView.bounds.size.height returns the CORRECT height based on the content added to the cardView (also taking into account the constraints in the xib).
What seems to be happening is that the cardView's true layout isn't fully realized until drawViewHierarchyInRect is called once.
My question is - what is drawViewHierarchyInRect doing to make cardView report the correct height? And can I somehow do that thing through some separate function without creating 2 screenshots of the UIView?
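One thing worth trying (a sketch, not verified against this particular xib) is to force Auto Layout to resolve the view's size before rendering, so the 600-point placeholder from the nib gets replaced by the constraint-driven size without drawing twice:
// Inside the if-let, before creating the image context:
cardView.setNeedsLayout()
cardView.layoutIfNeeded()
// Ask Auto Layout for the smallest size that satisfies the constraints, then apply it.
let fittingSize = cardView.systemLayoutSizeFittingSize(UILayoutFittingCompressedSize)
cardView.frame = CGRect(origin: CGPoint.zero, size: fittingSize)
cardView.layoutIfNeeded()
// ...then call UIGraphicsBeginImageContextWithOptions / drawViewHierarchyInRect as above.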
