Retina screenshot (Swift) - iOS

I am trying to take a screenshot of a UIView using the code below, but it creates a non-retina screenshot at half the actual device screen size.
Where am I going wrong?
Also, view.frame.size is returning half of the actual values.
let scale = UIScreen.mainScreen().scale
UIGraphicsBeginImageContextWithOptions(view.frame.size, false, scale)
view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
let newImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil)
Thanks
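A minimal sketch of an alternative, assuming iOS 10 or later (the extension and method name here are hypothetical, not from the question): UIGraphicsImageRenderer picks up the screen scale automatically, so the rendered image is full resolution in pixels even though view.frame.size, and the resulting image's size property, are measured in points.

```swift
import UIKit

// Sketch: a hypothetical UIView extension that snapshots at the device's
// native scale. UIGraphicsImageRendererFormat's default scale matches the
// screen (2x or 3x), so no manual scale handling is needed.
extension UIView {
    func retinaSnapshot() -> UIImage {
        let renderer = UIGraphicsImageRenderer(bounds: bounds)
        return renderer.image { context in
            layer.render(in: context.cgContext)
        }
    }
}
```

The returned image's size stays in points; its pixel dimensions are size multiplied by its scale, e.g. 375x667 points on a 2x device renders as 750x1334 pixels.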

Related

Quality gets reduced when converting an image view to an image

In a photo editor screen, I have an image view with a background image, and on top of the image view I add elements like text (labels), stickers (images), etc. To get the final image containing all the elements added to the image view, I use the code below.
clipRect is the rect of the background image inside the image view; the image is aspect-fit in the image view.
Below is the code, inside a UIView extension, with a function that generates an image from the view.
// self == the UIView
let op_format = UIGraphicsImageRendererFormat()
op_format.scale = 1.0
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize), format: op_format)
let originalBound = self.bounds
self.bounds = CGRect(origin: clipRect!.origin, size: clipRect!.size)
let finalImage = renderer.image { ctx in
self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}
self.bounds = originalBound
The issue here is that the quality of the final image is very poor compared to the original background image.
Don't set the scale of your UIGraphicsImageRendererFormat to 1. That forces the output image to @1x (non-retina); on most (all?) current iOS devices, that causes a 2x or 3x loss of resolution. Leave the scale of the UIGraphicsImageRendererFormat at its default value.
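A sketch of that advice (someView is a hypothetical placeholder; assumes iOS 10+): construct the format without touching its scale, which already matches the main screen.

```swift
import UIKit

// Sketch: leave the format's scale at its default, which matches the
// main screen's scale, rather than forcing it to 1.0 (@1x).
let format = UIGraphicsImageRendererFormat()
// format.scale is already UIScreen.main.scale here; do NOT set it to 1.0.
let renderer = UIGraphicsImageRenderer(bounds: someView.bounds, format: format)
let image = renderer.image { ctx in
    someView.drawHierarchy(in: someView.bounds, afterScreenUpdates: true)
}
// image.size is in points; the pixel dimensions are image.size * image.scale.
```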
You can take a screenshot this way as well:
// Convert a uiview to uiimage
func captureView() -> UIImage {
// Increase this number (or use UIScreen.mainScreen().scale) for higher resolution.
let scale: CGFloat = 1.0
UIGraphicsBeginImageContextWithOptions(bounds.size, opaque, scale)
layer.renderInContext(UIGraphicsGetCurrentContext()!)
let image:UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
return image
}

UIGraphicsBeginImageContext doesn't return retina image

I want to capture current screen in an image. I'm doing this:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.isOpaque, UIScreen.main.scale)
self.view.drawHierarchy(in: self.view.bounds, afterScreenUpdates: false)
let snapshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
The problem is the scale parameter. If I understand correctly, 0.0 represents non-retina, 2.0 represents retina, and 3.0 represents retina for the 6 Plus and 7 Plus. No matter what I pass for the scale parameter, the output is always an image with 375x667 resolution. I also tried a different approach:
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.isOpaque, UIScreen.main.scale)
self.view.layer.render(in: UIGraphicsGetCurrentContext()!)
let snapshot: UIImage? = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
Again, the same result. I'm even using
UIScreen.main.scale
Which in fact returns value 2.0. What am I doing wrong? How do I get a higher resolution image?
This code will do the trick
let contextSize = CGSize(width: self.view.bounds.size.width * UIScreen.main.scale, height: self.view.bounds.size.height * UIScreen.main.scale)
UIGraphicsBeginImageContextWithOptions(contextSize, self.view.isOpaque, UIScreen.main.scale)
self.view.layer.render(in: UIGraphicsGetCurrentContext()!)
let snapshot: UIImage? = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
CGContext doesn't care about scale; it cares about size only.
EDIT
You may want to use the newer API, available in iOS 10 and later:
let renderer = UIGraphicsImageRenderer(bounds: self.view.bounds)
let snapshot = renderer.image(actions: { context in
self.view.layer.render(in: context.cgContext)
})
snapshot is a UIImage, which has two properties, size and scale, much like the screen. To determine the actual size in pixels of the image, you must multiply the size by the scale. I think your issue is that you're assuming the size property is in pixels, not points.
size
The logical dimensions of the image, measured in points.
You can test this definitively by creating a JPG with UIImageJPEGRepresentation, saving it to disk, and inspecting it with image tools you're familiar with.
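The points-to-pixels relationship described above can be sketched directly (a hypothetical helper, pure arithmetic on the image's own properties):

```swift
import UIKit

// Sketch: a snapshot UIImage's pixel dimensions are its size (in points)
// multiplied by its scale factor.
func pixelSize(of image: UIImage) -> CGSize {
    return CGSize(width: image.size.width * image.scale,
                  height: image.size.height * image.scale)
}

// e.g. a 375x667-point image with scale 2.0 is 750x1334 pixels.
```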

How to take Snap shot of AGSMapView for ios?

UIGraphicsBeginImageContextWithOptions(self.AgsMapView.bounds.size,false, 0.0)
self.view.layer.renderInContext(UIGraphicsGetCurrentContext()!)
snapShot = UIGraphicsGetImageFromCurrentImageContext()
Here is my code; the code above gives me a blank (white) image.
Try this code:
let layer = self.AgsMapView.layer
let scale = UIScreen.mainScreen().scale
UIGraphicsBeginImageContextWithOptions(self.AgsMapView.frame.size, false, scale);
layer.renderInContext(UIGraphicsGetCurrentContext()!)
let snapShot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(snapShot, nil, nil, nil)
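If layer rendering still yields a blank image, one hedged alternative (not from the original answer): map and camera views are often GPU-backed, and renderInContext can't capture their contents, whereas drawViewHierarchyInRect snapshots what is actually on screen. A sketch, assuming the same AgsMapView property and Swift 2-era APIs used above:

```swift
// Sketch: drawViewHierarchyInRect captures GPU-rendered content
// (OpenGL/Metal-backed views) that renderInContext may miss.
let scale = UIScreen.mainScreen().scale
UIGraphicsBeginImageContextWithOptions(self.AgsMapView.bounds.size, false, scale)
self.AgsMapView.drawViewHierarchyInRect(self.AgsMapView.bounds, afterScreenUpdates: true)
let snapShot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
```

Note that this still may not capture a live camera preview layer, which is composited outside the normal view rendering path.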

Why does the screenshot show just white when I'm trying to take a screenshot of my app programmatically in Swift?

In my app, I have a view that holds a camera preview. I want to take a screenshot of this view. However, when I do so with the following code:
let layer = UIApplication.sharedApplication().keyWindow!.layer
let scale = UIScreen.mainScreen().scale
UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale);
layer.renderInContext(UIGraphicsGetCurrentContext()!)
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
the screenshot which was saved to photos is just blank and doesn't show the camera view.
Render and capture a UIView named view:
UIGraphicsBeginImageContextWithOptions(view.frame.size, false, 0.0)
view.drawViewHierarchyInRect(view.bounds, afterScreenUpdates: true)
let image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
Remember that UIWindow is a view too.
You can test in the simulator saving to the desktop with this function:
public func saveImageToDesktop(image : UIImage, name : String)
{
let basePath = NSString(string: "~/Desktop").stringByExpandingTildeInPath
let path = "\(basePath)/\(name).png"
UIImagePNGRepresentation(image)?.writeToFile(path, atomically: true)
}

Issue while trying to save an image to the camera roll SWIFT

I encountered a strange issue while trying to save a view.
The saved picture crops out part of the image.
Here is the code:
let scale = UIScreen.mainScreen().scale
let size:CGSize = CGSize(width: CGFloat(self.customView!.frame.size.width), height: CGFloat(self.customView!.frame.size.height))
UIGraphicsBeginImageContextWithOptions( size, false, scale);
self.customView!.layer.renderInContext(UIGraphicsGetCurrentContext()!)
let screenshot = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
Please help!
Thanks in advance.
The camera roll preview auto-zooms because the image's height is a bit smaller than the screen's, so the image only looks cropped.
