Render a view into an image - iOS

I'm using the camera to take pictures, and when I press a certain button the createImage method is called. In it I'm trying to render a view called containerView into the image taken by the camera in previewImageView. The issue is that the camera image is 1080x1920 while containerView is a different size depending on the device; on an iPhone 6, for instance, it is 375x503. How can I make sure that these two are rendered properly together, as in the quick illustration below?
With my current code the containerView ends up very small in the bottom corner.
My code at the moment:
func createImage() -> UIImage {
    // Get the taken image
    let fullSizeImage = previewImageView.image
    let newOverLayHeight = fullSizeImage!.size.width / self.containerView!.frame.width * self.containerView!.frame.height
    print(newOverLayHeight)
    // Render containerView to an image
    UIGraphicsBeginImageContext(CGSizeMake(fullSizeImage!.size.width, newOverLayHeight))
    self.containerView!.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let overlayImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    // Merge the two images
    let newSize = CGSizeMake(fullSizeImage!.size.width, fullSizeImage!.size.height)
    UIGraphicsBeginImageContext(newSize)
    fullSizeImage!.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
    // Draw overlayImage on top of fullSizeImage
    overlayImage.drawInRect(CGRectMake(0, fullSizeImage!.size.height - newOverLayHeight, overlayImage.size.width, overlayImage.size.height), blendMode: CGBlendMode.Normal, alpha: 1.0)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
What I want

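Not an answer from the thread, but for orientation, here is a minimal sketch in current Swift of one way to keep the two in proportion: scale the graphics context by the photo-to-view width ratio before rendering the layer, instead of rendering the overlay at its on-screen point size (previewImageView and containerView as in the question; the function name is hypothetical).
func makeCombinedImage() -> UIImage? {
    guard let fullSizeImage = previewImageView.image,
          let containerView = containerView else { return nil }
    // Ratio between the photo's width and the on-screen overlay view's width
    let scaleFactor = fullSizeImage.size.width / containerView.bounds.width
    let overlayHeight = containerView.bounds.height * scaleFactor
    let renderer = UIGraphicsImageRenderer(size: fullSizeImage.size)
    return renderer.image { ctx in
        // Draw the photo first
        fullSizeImage.draw(in: CGRect(origin: .zero, size: fullSizeImage.size))
        // Move to the bottom of the photo and scale so the overlay fills its width
        ctx.cgContext.translateBy(x: 0, y: fullSizeImage.size.height - overlayHeight)
        ctx.cgContext.scaleBy(x: scaleFactor, y: scaleFactor)
        containerView.layer.render(in: ctx.cgContext)
    }
}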
Related

Quality gets reduced when converting an image view to an image

In a photo editor screen I have an image view with a background image, and on top of it I add elements like text (labels), stickers (images), etc. To get the final image containing all of the elements added to the image view, I use the code below.
clipRect is the rect of the background image inside the image view; the image is aspect-fit in the image view.
The code lives inside a UIView extension with a function that generates an image from the view.
// self == the UIView being rendered
let op_format = UIGraphicsImageRendererFormat()
op_format.scale = 1.0
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize), format: op_format)
let originalBound = self.bounds
self.bounds = CGRect(origin: clipRect!.origin, size: clipRect!.size)
let finalImage = renderer.image { ctx in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}
self.bounds = CGRect(origin: originalBound.origin, size: originalBound.size)
The issue is that the quality of the final image is very poor compared to the original background image.
Don't set the scale of your UIGraphicsImageRendererFormat to 1. That forces the output image to @1x (non-retina). For most (all?) iOS devices, that will cause a 2x or 3x loss of resolution. Leave the scale value of the UIGraphicsImageRendererFormat at the default value.
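For example, a minimal sketch of the same renderer setup with the forced scale removed (clipRect and outputSize as in the question's code; when no format is passed, the renderer uses the screen's native scale):
// Omit the custom format so the renderer draws at the device's native scale (2x/3x)
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize))
let finalImage = renderer.image { _ in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}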
You can take a screenshot as well:
// Convert a UIView to a UIImage
func captureView() -> UIImage {
    // Gradually increase the number for higher resolution.
    let scale: CGFloat = 1.0
    UIGraphicsBeginImageContextWithOptions(bounds.size, opaque, scale)
    layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}

Image isn't bound to the UIImageView

I have a UIImageView created in the Inspector that I resize in code based on a selected image fetched from the web. However, on the first load of the image, it's displayed at the image's normal resolution instead of within the UIImageView's newly set bounds.
Resizing the UIImageView:
fullScreenImage.bounds.size = CGSize(width: scaledWidth, height: scaledHeight)
Setting the UIImageView's image:
let imageStringURL = images[indexPath.row].urls!["regular"]
let imageURL = URL(string: imageStringURL!)!
let imageData = try! Data(contentsOf: imageURL)
let image = UIImage(data: imageData)
fullScreenImage.image = image
This is how it looks when the image is first clicked to enter "fullscreen mode".
This is how it looks the second time I click it.
I'm not really sure why the image isn't staying within the specified UIImageView bounds.
Instead of resizing the bounds, you can set the UIImageView's contentMode (a UIViewContentMode value). This will resize the image view's image to fit inside the bounds.
fullScreenImage.contentMode = .scaleAspectFit
I found a workaround solution. I tried setting the contentMode to aspect fit, and I also tried enabling clip to bounds in the Inspector, but neither worked. So I simply looked into resizing the UIImage itself and placing it into the UIImageView.
I found a UIImage extension in another post that resizes it:
extension UIImage {
    func resizeImage(newSize: CGSize) -> UIImage {
        // Only redraw if the size actually changes
        guard self.size != newSize else { return self }
        UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0)
        self.draw(in: CGRect(x: 0, y: 0, width: newSize.width, height: newSize.height))
        let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return newImage
    }
}
If anyone knows of a better way please let me know :)
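One possible "better way" (just a sketch, not from the original thread): do the same resize with UIGraphicsImageRenderer, which manages the context and scale for you.
import UIKit

extension UIImage {
    // Sketch: redraw the image at newSize using UIGraphicsImageRenderer
    func resized(to newSize: CGSize) -> UIImage {
        guard size != newSize else { return self }
        return UIGraphicsImageRenderer(size: newSize).image { _ in
            self.draw(in: CGRect(origin: .zero, size: newSize))
        }
    }
}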

Pasteboard UIImage not using scale

I am building a custom keyboard and am having trouble adding an image to the pasteboard while maintaining the appropriate scale and resolution within the pasted image. Let me start with a screenshot of the keyboard to illustrate my trouble:
The picture of the face in the top left of the keyboard is just a UIButton with the original photo set as its background. When the button is pressed, the image is resized with the following function:
func imageResize(image: UIImage, size: CGSize) -> UIImage {
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(size, false, scale)
    let context = UIGraphicsGetCurrentContext()
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh)
    image.drawInRect(CGRect(origin: CGPointZero, size: size))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
This function creates a UIImage the same size as the UIButton, with the appropriate scale to reflect the device's screen resolution. To verify that the function is correct, I added a UIImageView filled with the scaled image. The scaled image is the one that looks misplaced near the center of the keyboard. I added the UIImageView with this function:
func addImageToBottomRight(image: UIImage) {
    let tempImgView = UIImageView(image: image)
    self.view.addSubview(tempImgView)
    tempImgView.frame.offset(dx: 100.0, dy: 50.0)
}
I have tried a few different methods for adding the image to the pasteboard, but all seem to ignore the scale of the image and display it twice as large as opposed to displaying it at a higher resolution:
let pb = UIPasteboard.generalPasteboard()!
var pngType = UIPasteboardTypeListImage[0] as! String
var jpegType = UIPasteboardTypeListImage[2] as! String
pb.image = image
pb.setData(UIImagePNGRepresentation(image), forPasteboardType: pngType)
pb.setData(UIImageJPEGRepresentation(image, 1.0), forPasteboardType: jpegType)
All three of these methods do not work correctly and produce the same result as illustrated in the screenshot. Does anyone have any suggestions of other methods? To further clarify my goal, I would like the image in the message text box to look identical to both UIImages in the keyboard in terms of size and resolution.
Here are a few properties of the UIImage before and after the resize, in case anyone is curious:
Starting Image Size is (750.0, 750.0)
Size to scale to is: (78.0, 78.0)
Initial Scale: 1.0
Resized Image Size is (78.0, 78.0)
Resized Image Scale: 2.0
I know this is an old post, but I thought I'd share the workaround I found for this specific case of copying images and pasting them into messaging apps. The thing is, when you send a picture with apps like iMessage, WhatsApp, Messenger, etc., they display the image aspect-fitted to a certain horizontal width (let's say around 260 pt for this demo).
As you can see from the diagram below, if you send a 150x150 image at @1x resolution in iMessage, it will be stretched to fill the required 260-wide box, making the image grainy.
But if you add an empty margin of width 185 to both the left and right sides of the image, you end up with an image of size 520x150. Now if you send that image in iMessage, it has to fit it into a 260-wide box, cramming a 520x150 image into a 260x75 box and in effect giving you a 75x75 image at @2x resolution.
You can add a clear-color margin to a UIImage with code like this:
extension UIImage {
    func addMargin(withInsets inset: UIEdgeInsets) -> UIImage? {
        let finalSize = CGSize(width: size.width + inset.left + inset.right, height: size.height + inset.top + inset.bottom)
        let finalRect = CGRect(origin: CGPoint(x: 0, y: 0), size: finalSize)
        UIGraphicsBeginImageContextWithOptions(finalSize, false, scale)
        defer { UIGraphicsEndImageContext() }
        // Fill the whole canvas with clear, then draw the original image inset from the edges
        UIColor.clear.setFill()
        UIGraphicsGetCurrentContext()?.fill(finalRect)
        let pictureOrigin = CGPoint(x: inset.left, y: inset.top)
        let pictureRect = CGRect(origin: pictureOrigin, size: size)
        draw(in: pictureRect)
        let finalImage = UIGraphicsGetImageFromCurrentImageContext()
        return finalImage
    }
}
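For example, to pad the 150x150 image from the diagram to 520x150 (numbers from the explanation above; the image name is hypothetical):
// Hypothetical usage: 185 pt of clear margin on the left and right
let paddedImage = faceImage.addMargin(withInsets: UIEdgeInsets(top: 0, left: 185, bottom: 0, right: 185))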

How to take screenshot of UIScrollView visible area?

How do I take a 1:1 screenshot of a UIScrollView's visible area? The content may be larger or smaller than the UIScrollView's bounds, as well as half-hidden (I've implemented custom scrolling for smaller content, so it's not in the top-left corner).
I've achieved the desired result on the simulator, but not on the device itself:
- (UIImage *)imageFromCombinedContext:(UIView *)background {
    UIImage *image;
    CGRect vis = background.bounds;
    CGSize size = vis.size;
    UIGraphicsBeginImageContext(size);
    [background.layer affineTransform];
    [background.layer renderInContext:UIGraphicsGetCurrentContext()];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGImageRef imref = CGImageCreateWithImageInRect([image CGImage], vis);
    image = [UIImage imageWithCGImage:imref];
    CGImageRelease(imref);
    return image;
}
Another approach would be to use the contentOffset to adjust the layer's visible area and capture only the currently visible area of UIScrollView.
UIScrollView *contentScrollView; // ... scroll view instance
UIGraphicsBeginImageContextWithOptions(contentScrollView.bounds.size,
                                       YES,
                                       [UIScreen mainScreen].scale);
// this is the key
CGPoint offset = contentScrollView.contentOffset;
CGContextTranslateCTM(UIGraphicsGetCurrentContext(), -offset.x, -offset.y);
[contentScrollView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *visibleScrollViewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Cheers :)
Swift version of Abduliam Rehmanius's answer:
func screenshot() -> UIImage {
    UIGraphicsBeginImageContextWithOptions(self.scrollCrop.bounds.size, true, UIScreen.mainScreen().scale)
    // this is the key
    let offset: CGPoint = self.scrollCrop.contentOffset
    CGContextTranslateCTM(UIGraphicsGetCurrentContext(), -offset.x, -offset.y)
    self.scrollCrop.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let visibleScrollViewImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return visibleScrollViewImage
}
Swift 4 version:
func screenshot() -> UIImage {
    UIGraphicsBeginImageContextWithOptions(self.scrollCrop.bounds.size, false, UIScreen.main.scale)
    let offset = self.scrollCrop.contentOffset
    let thisContext = UIGraphicsGetCurrentContext()
    thisContext?.translateBy(x: -offset.x, y: -offset.y)
    self.scrollCrop.layer.render(in: thisContext!)
    let visibleScrollViewImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return visibleScrollViewImage!
}
I've found a solution myself: I take a screenshot of the whole view and then crop it to the size and position of the UIScrollView's frame.
- (UIImage *)imageFromCombinedContext:(UIView *)background
{
    UIImage *image;
    CGSize size = self.view.frame.size;
    UIGraphicsBeginImageContext(size);
    [background.layer affineTransform];
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    CGImageRef imgRef = CGImageCreateWithImageInRect([image CGImage], background.frame);
    image = [UIImage imageWithCGImage:imgRef];
    CGImageRelease(imgRef);
    return image;
}
Swift 4 version of Abduliam Rehmanius's answer, as a UIScrollView extension that uses translation instead of slow cropping:
extension UIScrollView {
    var snapshotVisibleArea: UIImage? {
        UIGraphicsBeginImageContext(bounds.size)
        UIGraphicsGetCurrentContext()?.translateBy(x: -contentOffset.x, y: -contentOffset.y)
        layer.render(in: UIGraphicsGetCurrentContext()!)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
Jeffery Sun has the right answer: just put your scroll view inside another view and render that container view into the context. Done.
In the code below, cropView contains the scroll view to be captured. The solution really is just that simple.
As I understand the question (and why I found this page), the whole content of the scroll view isn't wanted - just the visible portion.
func captureCrop() -> UIImage {
    UIGraphicsBeginImageContextWithOptions(self.cropView.frame.size, true, 0.0)
    self.cropView.layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}
@Abduliam Rehmanius's answer has poor performance: if the UIScrollView contains a large content area, we draw that entire content area (even outside the visible bounds).
@Concuror's answer has the issue that it will also draw anything that is on top of the UIScrollView.
My solution was to put the UIScrollView inside a UIView called containerView with the same bounds and then render containerView:
containerView.layer.renderInContext(context)
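Spelled out as a sketch in current Swift (assuming containerView wraps the scroll view exactly):
// Render the wrapper view; it clips the scroll view to the visible area
let renderer = UIGraphicsImageRenderer(bounds: containerView.bounds)
let visibleImage = renderer.image { ctx in
    containerView.layer.render(in: ctx.cgContext)
}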
Swift 3.0:
func captureScreen() -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(self.yourScrollViewName.bounds.size, true, UIScreen.main.scale)
    let offset: CGPoint = self.yourScrollViewName.contentOffset
    UIGraphicsGetCurrentContext()!.translateBy(x: -offset.x, y: -offset.y)
    self.yourScrollViewName.layer.render(in: UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}
and use it as:
let image = captureScreen()
Update for Swift 3+/4 of @Concuror's code:
func getImage(fromCombinedContext background: UIView) -> UIImage {
    var image: UIImage?
    let size: CGSize = view.frame.size
    UIGraphicsBeginImageContext(size)
    background.layer.affineTransform()
    view.layer.render(in: UIGraphicsGetCurrentContext()!)
    image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    let imgRef = image?.cgImage?.cropping(to: background.frame)
    image = UIImage(cgImage: imgRef!)
    // CGImageRelease(imgRef!) // Not needed in Swift: Core Foundation objects are automatically memory managed
    return image ?? UIImage()
}
A lot of the answers use UIGraphicsBeginImageContext (pre-iOS 10.0) to create an image; this produces an image missing the P3 colour gamut - reference https://stackoverflow.com/a/41288197/2481602
extension UIScrollView {
    func asImage() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { rendererContext in
            inputView?.layer.render(in: rendererContext.cgContext)
            layer.render(in: rendererContext.cgContext)
        }
    }
}
The above will result in a better quality image being produced.
The second image is clearer and shows more of the colours; it was produced with UIGraphicsImageRenderer rather than UIGraphicsBeginImageContext (first image).
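Usage is then just (scroll view name hypothetical):
let snapshot = myScrollView.asImage()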

Changing the size and location of a UIImage within UIImageView (swift)

In Swift, I'm struggling to programmatically resize an image (and change its location) within a UIImageView, as follows:
var housePic = UIImage(named:"house")
var houseImageView = UIImageView(image: housePic)
I want to be able to create a second image view with a cropped/resized version of the above houseImageView, using the same image. The result should show a section of the image from a grid like this:
All my resizing attempts give the following wrong result. I think I need to somehow resize the image and change the image view's bounds?
If required, I can post plenty of my failed example code.
If you want to create a second image with size 100x100 and add it to the previous UIImageView, you can do this:
let imageName = "house.png"
let originalImage = UIImage(named:imageName)!
let imageView = UIImageView(image: originalImage)
// this is only for visualization purposes
imageView.backgroundColor = UIColor.lightGrayColor()
self.view.addSubview(imageView)
// create a new image resizing it
let destinationSize = CGSizeMake(100, 100)
UIGraphicsBeginImageContext(destinationSize);
originalImage.drawInRect(CGRectMake(0,0,destinationSize.width,destinationSize.height))
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext();
// add the new image to the UIImageView
imageView.image = newImage
imageView.contentMode = UIViewContentMode.Center
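If you only want one section of the grid rather than the whole scaled-down image, here is a cropping sketch in current Swift syntax (the 3x3 grid and the chosen cell indices are assumptions for illustration, reusing originalImage and imageView from above):
// Assume a 3x3 grid; pick the cell at column 1, row 0 (hypothetical indices)
let cellSize = CGSize(width: originalImage.size.width / 3, height: originalImage.size.height / 3)
let cropRect = CGRect(x: cellSize.width * 1, y: cellSize.height * 0, width: cellSize.width, height: cellSize.height)
// cgImage works in pixels, so scale the rect by the image's scale factor
let pixelRect = CGRect(x: cropRect.minX * originalImage.scale,
                       y: cropRect.minY * originalImage.scale,
                       width: cropRect.width * originalImage.scale,
                       height: cropRect.height * originalImage.scale)
if let croppedCG = originalImage.cgImage?.cropping(to: pixelRect) {
    imageView.image = UIImage(cgImage: croppedCG, scale: originalImage.scale, orientation: originalImage.imageOrientation)
    imageView.contentMode = .scaleAspectFit
}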

Resources