Merging two scrolled and zoomed images in Swift - ios

I'm trying to create functionality like PicFrame or any other application for creating a photo collage in one frame.
I've created two scroll views and two image views in these scroll views for scrolling and zooming the images. It works well.
Then I need to create one square image from the two rectangular images.
let firstImage = UIImage(named: leftImagePath)
let secondImage = UIImage(named: rightImagePath)
let size = CGSize(width: 1080, height: 1080)
UIGraphicsBeginImageContext(size)
// Left image fills the left half of the square canvas
let leftImageAreaSize = CGRect(x: 0, y: 0, width: size.width / 2, height: size.height)
firstImage!.draw(in: leftImageAreaSize)
// Right image fills the right half (the +1 leaves a one-point seam)
let rightImageAreaSize = CGRect(x: size.width / 2 + 1, y: 0, width: size.width / 2, height: size.height)
secondImage!.draw(in: rightImageAreaSize)
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
This code works well, but before creating the square image I need to apply the scroll and zoom values, cropping and scaling the images accordingly.
Can anyone guide me on how to do this?

I make PicFrame, so I suppose I have some experience here. Although this isn't what I do and I haven't tried it myself, if you just want a quick image of what you see, you could use drawViewHierarchyInRect and capture the screen area.
Otherwise, what you want to do is get the CGPoint contentOffset and the CGSize bounds.size from the UIScrollView, then divide both by the UIScrollView's zoomScale. Make sure your contentSize is the size of the image, so that a zoomScale of 1.0 corresponds to the width and height of the original image.
From this you should be able to make a CGRect, the x, y, width and height of what is visible in your scroll view, translated into the image's coordinates. Crop the image to this rect and then draw it into your final graphics context at your desired CGRect.
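Putting that together, here is a minimal sketch of the crop step, assuming (as above) that contentSize at a zoomScale of 1.0 equals the image's size and that the image's scale is 1; the helper names are my own:
func visibleImageRect(of scrollView: UIScrollView) -> CGRect {
    // Visible region of the scroll view, translated into image coordinates
    let zoom = scrollView.zoomScale
    return CGRect(x: scrollView.contentOffset.x / zoom,
                  y: scrollView.contentOffset.y / zoom,
                  width: scrollView.bounds.size.width / zoom,
                  height: scrollView.bounds.size.height / zoom)
}
func visibleImage(_ image: UIImage, in scrollView: UIScrollView) -> UIImage? {
    // Crop the image to what the scroll view currently shows
    guard let cropped = image.cgImage?.cropping(to: visibleImageRect(of: scrollView)) else {
        return nil
    }
    return UIImage(cgImage: cropped)
}
Each cropped image can then be drawn into its half of the 1080 x 1080 context exactly as in the question's code.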

Related

UIScrollView zoomToRect center of currently visible view

I have a UIScrollView displaying an image. I want to programmatically zoom in to a rect somewhere near the center (it doesn't have to be exact) of the currently visible area. How would I get the coordinates of this rect to use with zoomToRect? Note that the image could already be zoomed in and only showing a fraction of the scrollView content area.
The X and Y positions of that image are relative to the scrollview's contentSize. The area shown on screen is defined by the scrollview's contentOffset.
You then take the position of your scrollview on screen and the position of the selection rectangle on screen.
Finally, you need to do rather simple maths (a few additions and subtractions) for both X and Y using the above values, as the sketch below shows.
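A rough illustration of that maths (a sketch; the function name and target size are my own assumptions):
func centeredZoomRect(in scrollView: UIScrollView, targetSize: CGSize) -> CGRect {
    // Visible area expressed in the zoomable content's coordinate space
    let zoom = scrollView.zoomScale
    let visible = CGRect(x: scrollView.contentOffset.x / zoom,
                         y: scrollView.contentOffset.y / zoom,
                         width: scrollView.bounds.width / zoom,
                         height: scrollView.bounds.height / zoom)
    // A rect of the target size centred in that visible area
    return CGRect(x: visible.midX - targetSize.width / 2,
                  y: visible.midY - targetSize.height / 2,
                  width: targetSize.width,
                  height: targetSize.height)
}
// Usage:
// scrollView.zoom(to: centeredZoomRect(in: scrollView, targetSize: CGSize(width: 100, height: 100)), animated: true)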
Grab the UIImageView's frame and call insetBy(dx:dy:):
Returns a rectangle that is smaller or larger than the source
rectangle, with the same center point.
From Apple Documentation
Here's a quick visualisation in a playground:
let blueSquare = UIView(frame: CGRect(x: 0, y: 0, width: 300, height: 300))
let yellowSquare = UIView(frame: blueSquare.frame.insetBy(dx: 100, dy: 100))
blueSquare.backgroundColor = .blue
yellowSquare.backgroundColor = .yellow
blueSquare.addSubview(yellowSquare)
This results in a 100 x 100 yellow square centred inside the 300 x 300 blue square.

Frame of CardIO view

I am working on an iOS app in Swift 3.0 and I have integrated card scanner using Card.IO for iOS.
I can scan the card successfully, but the problem is that the width and height of the camera view do not respect the frame I pass.
It only takes width and height in a 3:4 ratio.
I want the camera to take half of the screen height and the full screen width, but it does not. When I pass the frame as
cardView = CardIOView(frame: CGRect(x: 0, y: 100, width: screen.width, height: screen.height / 2))
it does not take the full screen width.
Is it a bug on the SDK side? I have tried everything, but with no success.
If anyone can help, thanks in advance!
My suggestion is that you are determining the screen size incorrectly (see Swift: Determine iOS Screen size):
let screenSize = UIScreen.main.bounds
let screenWidth = screenSize.width
let screenHeight = screenSize.height
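Applied to the frame from the question (a sketch, reusing the question's cardView property):
// Half the screen height, full screen width, as the question asks
cardView = CardIOView(frame: CGRect(x: 0, y: 100, width: screenWidth, height: screenHeight / 2))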

Save UIImage at full size including transforms from UIImageView

I have an image which at full size is 1800 x 2400 pixels. It's being displayed in a UIImageView at 450 x 600.
I have attached a UIPinchGestureRecognizer and a UIRotationGestureRecognizer to the UIImageView which allows the user to zoom and rotate the image which is working great.
I now need to export the image at the original size (1800 x 2400) with the rotation and transforms applied, and I'm struggling to get this to work.
My code so far:
if let imageView = imageView, let image = imageView.image {
    UIGraphicsBeginImageContext(image.size)
    let context = UIGraphicsGetCurrentContext()
    // Flip the context vertically so the CGImage draws right side up
    context?.translateBy(x: 0, y: image.size.height)
    context?.scaleBy(x: 1.0, y: -1.0)
    // Apply the rotation captured in the image view's transform
    context?.rotate(by: -imageView.transform.b)
    context?.draw(image.cgImage!, in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    translatedImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
}
With the above, the rotation is correct, but the image is drawn off-center and I can't figure out how to include the zoom.
Graphics is very much my weak point in development and I've been at this for hours with no success.
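For what it's worth, the off-center result usually comes from rotating about the context's origin rather than the image's center. A sketch of that adjustment, offered only as an illustration (the function name is an assumption, and this is not part of the answer that follows):
func exportFullSize(image: UIImage, transform: CGAffineTransform) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    guard let context = UIGraphicsGetCurrentContext() else { return nil }
    // Move the origin to the image's center, apply the view's
    // rotation/zoom transform, then draw the image centred on that origin
    context.translateBy(x: image.size.width / 2, y: image.size.height / 2)
    context.concatenate(transform)
    image.draw(in: CGRect(x: -image.size.width / 2,
                          y: -image.size.height / 2,
                          width: image.size.width,
                          height: image.size.height))
    let result = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return result
}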
Never zoom the image itself, because it's time consuming and it's not the recommended way.
Use a scroll view for zooming and you won't have to think about your image size. Put your large images in xcassets. UIScrollView has delegate methods for zooming, and some useful properties too, like these:
scrollView.minimumZoomScale = 0.5
scrollView.maximumZoomScale = 6.0
So better to move your effort to UIScrollView, and you will see you don't have to write much code for that.
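Zooming needs just one delegate method; a minimal sketch, assuming a ViewController that owns an imageView inside scrollView:
extension ViewController: UIScrollViewDelegate {
    // Tell the scroll view which subview to scale when the user pinches
    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        return imageView
    }
}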

Resize an image based on the height of the mobile device

How can I resize an image based on the height of a mobile device that's in landscape mode? I have a wide image (a ruler) and want to be able to slide the image back and forth. I've tried scaling the image, but I can't seem to get it to work.
func resizeImage(size: CGSize) {
    // Scale so the image's height matches the image view's height
    let scaleFactor = imageView.bounds.height / size.height
    let newHeight = imageView.bounds.height * scaleFactor
    let newWidth = imageView.bounds.width * scaleFactor
    let newSize = CGSize(width: newWidth, height: newHeight)
    imageView.frame = CGRect(origin: imageView.frame.origin, size: newSize)
    scrollView.contentSize = imageView.bounds.size
    scrollView.autoresizingMask = [.flexibleRightMargin, .flexibleLeftMargin, .flexibleTopMargin, .flexibleBottomMargin]
    self.scrollView.contentMode = UIViewContentMode.scaleAspectFit
}
If you are using a storyboard, then it is easier to do with Auto Layout constraints. You have to use an Equal Heights constraint with a multiplier. I am also a beginner and haven't used this method in landscape mode, but I am assuming it will work for landscape too.
Follow these steps:
Suppose the view you are using in your storyboard is the iPhone 7 size, so your screen height will be 667.
Now, suppose in this view your image looks perfect with a height of 200.
So take the ratio 200/667 ≈ 0.3.
Now set an Equal Heights constraint by selecting both views (i.e. the imageView and its superview), using the ratio as the multiplier.
You can check other devices and orientations with the device configuration option at the bottom of the storyboard canvas.
In the end, the height constraint of the image should use that ratio as its multiplier (my height ratio is 0.4).
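The same Equal Heights constraint can also be created in code; a minimal sketch, assuming imageView is a subview of view and using the 0.3 ratio from the example:
imageView.translatesAutoresizingMaskIntoConstraints = false
// Pin the image view's height to 0.3 of its superview's height,
// mirroring the storyboard multiplier above
imageView.heightAnchor.constraint(equalTo: view.heightAnchor, multiplier: 0.3).isActive = true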

Swift how to make a UIView larger than the phone screen - can we have its size match an image's size, then zoom out?

I don't have much experience with iOS/Swift. I want to make a canvas for drawing on an image from the gallery (which I managed with some tutorials), but I cannot make the UIView's size the same as the image's size. I had to scale down the image to fit it on the screen, but I want it the other way around.
Let's say I want to make a UIView at 1600x1200 (my image size); what do I have to do? Some example code or an idea would be great!
Try this:
myView.frame = CGRect(origin: CGPoint(x: 0, y: 0), size: CGSize(width: 1600, height: 1200))
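To then zoom out so the oversized view fits the screen, one approach (a sketch; the scrollView and its delegate wiring are assumptions, not part of the original answer) is to host it in a UIScrollView with a minimum zoom scale below 1.0:
scrollView.addSubview(myView)
scrollView.contentSize = myView.frame.size
// Allow zooming out until the 1600-point-wide view fits the screen width
scrollView.minimumZoomScale = UIScreen.main.bounds.width / myView.frame.width
scrollView.maximumZoomScale = 2.0
scrollView.delegate = self  // return myView from viewForZooming(in:)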
