Quality gets reduced when converting image view to image - iOS

In a photo editor screen, I have an image view with a background image, and on top of the image view I add elements such as text (labels), stickers (images), etc. To produce the final image containing all the elements added on the image view, I use the code below.
clipRect is the rect of the background image inside the image view; the image is aspect-fit in the image view.
The code lives in a UIView extension with a function that generates an image from the view.
// self is the UIView being rendered
let op_format = UIGraphicsImageRendererFormat()
op_format.scale = 1.0
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize), format: op_format)
let originalBounds = self.bounds
self.bounds = CGRect(origin: clipRect!.origin, size: clipRect!.size)
let finalImage = renderer.image { ctx in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}
self.bounds = originalBounds
The issue is that the quality of the final image is very poor compared to the original background image.

Don't set the scale of your UIGraphicsImageRendererFormat to 1. That forces the output image to @1x (non-retina). On most (all?) iOS devices, that causes a 2x or 3x loss of resolution. Leave the scale of the UIGraphicsImageRendererFormat at its default value.
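A minimal sketch of the corrected snapshot code, assuming the same clipRect setup as in the question (the extension and method name here are illustrative):

```swift
import UIKit

extension UIView {
    /// Renders the view hierarchy into an image, clipped to `clipRect`.
    /// `clipRect` is assumed to be the aspect-fit rect of the background
    /// image inside the image view, as in the question.
    func snapshot(clippedTo clipRect: CGRect) -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        // Leave format.scale at its default (the screen scale) so the
        // output keeps retina resolution; do NOT force it to 1.0.
        let renderer = UIGraphicsImageRenderer(bounds: clipRect, format: format)
        return renderer.image { _ in
            drawHierarchy(in: bounds, afterScreenUpdates: true)
        }
    }
}
```

The resulting image has the clip rect's size in points, but its backing bitmap is at the screen scale, so no resolution is lost.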

You can take a screenshot as well:
// Convert a UIView to a UIImage
func captureView() -> UIImage {
    // Gradually increase this number for higher resolution (0 uses the screen's scale).
    let scale: CGFloat = 1.0
    UIGraphicsBeginImageContextWithOptions(bounds.size, isOpaque, scale)
    layer.render(in: UIGraphicsGetCurrentContext()!)
    let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()
    return image
}

Related

How to change the size of a CIImage mask as an input to blendKernel.apply function in Swift

I am trying to create a live hair color mask iOS app. I used the UIGraphicsGetImageFromCurrentImageContext() function to merge the mask and the background from video. It works, but the image quality is not good enough. I wanted to change my code to use the blendKernel.apply(foreground: maskCI!, background: backgroundCI!) function. In that case, the size and position of the mask and the background image do not match: the mask image is shown in the bottom-right corner of the background image. My background image size is 1080x1080 and the mask image is 224x224. Could you please advise me how to overlap the mask and the background? Please see my code below:
class IntegrateViewAndMask {
    var blendKernel: CIBlendKernel { return .softLight }

    func mergeMaskAndBackground(mask: UIImage, background: CVPixelBuffer, size: Int) -> UIImage? {
        // Merge two images
        // let sizeImage = CGSize(width: size, height: size)
        // UIGraphicsBeginImageContext(sizeImage)
        // let areaSize = CGRect(x: 0, y: 0, width: sizeImage.width, height: sizeImage.height)
        // background
        var background = UIImage(pixelBuffer: background)
        // crop image to square 1080 x 1080
        background = cropImageToSquare(image: background!)
        // background?.draw(in: areaSize)
        // mask
        // mask.draw(in: areaSize)
        let backgroundCI = CIImage(image: background!)
        let maskCI = CIImage(image: mask)
        let trialImage = blendKernel.apply(foreground: maskCI!, background: backgroundCI!)
        let trialImageUI = UIImage(ciImage: trialImage!)
        // let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
        // UIGraphicsEndImageContext()
        return trialImageUI
    }
}
I have just found an answer and tried it. Please see the link below. As suggested there, I resized the UIImage to the bounding square and then converted it to a CIImage, and it works perfectly now.
CIImage extent in pixels or points?
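An alternative sketch that stays entirely in Core Image: scale the mask's CIImage up to the background's extent before blending, so their sizes and positions match. The function name is illustrative; the soft-light kernel is the one from the question.

```swift
import CoreImage

/// Matches the 224x224 mask to the 1080x1080 background before blending,
/// by scaling the mask's extent onto the background's extent.
func blendMatched(mask maskCI: CIImage, background backgroundCI: CIImage) -> CIImage? {
    // Scale factors that map the mask's extent onto the background's extent.
    let sx = backgroundCI.extent.width / maskCI.extent.width
    let sy = backgroundCI.extent.height / maskCI.extent.height
    let scaledMask = maskCI.transformed(by: CGAffineTransform(scaleX: sx, y: sy))
    // Same soft-light kernel as in the question.
    return CIBlendKernel.softLight.apply(foreground: scaledMask, background: backgroundCI)
}
```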

Resizing UIImage to fit table cell ImageView

I have images of the size 176 points by 176 points. I am trying to resize them so that they fit into a tableViewCell's ImageView. There are a few problems that I am coming across.
Problems
I don't know what size the image view in the tableViewCell actually is.
If I simply add the image without resizing it, it is so sharp that it loses detail:
If I use this extension on UIImage (below), then the transparent parts of the UIImage turn to black, which is not what I want:
extension UIImage {
    func resizeImageWithBounds(bounds: CGSize) -> UIImage {
        let horizontalRatio = bounds.width/size.width
        let verticalRatio = bounds.height/size.height
        let ratio = max(horizontalRatio, verticalRatio)
        let newSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        UIGraphicsBeginImageContextWithOptions(newSize, true, 0)
        draw(in: CGRect(origin: CGPoint.zero, size: newSize))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
I am looking for information on how big the UIImageView is and how to best resize an image into it. I really don't want to create another set of assets (I have a lot of images), and I don't think I should have to.
Try changing the contentMode of the cell.
cell.imageView?.contentMode = .scaleAspectFit
Or, if that doesn't work, here's how to fix the issue of your resized images turning black:
UIGraphicsBeginImageContextWithOptions(newSize, false, 0) // <-- changed opaque setting from true to false
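Alternatively, a sketch of the same aspect-fill resize using UIGraphicsImageRenderer, whose default format is already non-opaque, so there is no opaque flag to get wrong. The method name here is mine, not from the question:

```swift
import UIKit

extension UIImage {
    /// Aspect-fill resize that preserves transparency; mirrors the
    /// extension in the question (same max-ratio logic) but uses the
    /// modern renderer API instead of the legacy context functions.
    func resizedFilling(_ bounds: CGSize) -> UIImage {
        let ratio = max(bounds.width / size.width, bounds.height / size.height)
        let newSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        // The default UIGraphicsImageRendererFormat is transparent.
        let renderer = UIGraphicsImageRenderer(size: newSize)
        return renderer.image { _ in
            draw(in: CGRect(origin: .zero, size: newSize))
        }
    }
}
```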

UIImageView AspectFill failing

I'm rendering a 16:9 pixel image programmatically and then using a UIImageView with the kCAFilterNearest filter and .ScaleAspectFill content mode to display it as a background image. The result I get is this:
In the picture you can see that, for whatever reason, it scales the picture exactly right but moves it up slightly (by about half a scaled pixel) and leaves a line at the bottom, which belongs to the UIImageView the image is displayed in (it's the only view on this UIViewController, and its background color is "navy").
What might be the reason for this, considering that I use the UIScreen.mainScreen().bounds rectangle for the UIImageView?
P.S. The ScaleToFill content mode gives much the same result (the line at the bottom remains).
I use a function to resize the image to fit the screen:
func imageResize(imageObj: UIImage, sizeChange: CGSize) -> UIImage {
    let hasAlpha = false
    let scale: CGFloat = 0.0 // Automatically use scale factor of main screen
    UIGraphicsBeginImageContextWithOptions(sizeChange, !hasAlpha, scale)
    imageObj.drawInRect(CGRect(origin: CGPointZero, size: sizeChange))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
And to call it:
let mainScreenSize: CGRect = UIScreen.mainScreen().bounds
let image: UIImage! = self.imageResize(UIImage(named: "IMAGE_NAME")!, sizeChange: CGSizeMake(mainScreenSize.width, mainScreenSize.height))
The initial image I used was 16:9 as well.

Text size is not the same when drawing text on a UIImage

I am working on a project where I need to add text from a text field to an image.
But when I look at the text on the image, the font size is smaller than the font size in the text field.
I am using the following method to draw text on the image:
func drawText(text: String, inImage image: UIImage, atPoint point: CGPoint, fontName: String, fontSize: String, textColor: UIColor) -> UIImage? {
    // Begin a single context; beginning a second, unscaled context here
    // (as the original code did) would discard the screen scale.
    UIGraphicsBeginImageContextWithOptions(image.size, true, UIScreen.mainScreen().scale)
    image.drawInRect(CGRectMake(0, 0, image.size.width, image.size.height))
    let rect: CGRect = CGRectMake(point.x, point.y, image.size.width, image.size.height)
    // UIColor.whiteColor().set()
    // set the font to Helvetica Neue 18
    if let sizeFont = NSNumberFormatter().numberFromString(fontSize) {
        // TODO: - Need to resolve issue
        let originalSize = sizeFont.integerValue
        let finalFontSize = CGFloat(originalSize)
        let fieldFont = UIFont(name: fontName, size: finalFontSize * 1.5)
        // set the line spacing to 6
        let paraStyle = NSMutableParagraphStyle()
        // paraStyle.lineSpacing = 6.0
        // set the Obliqueness to 0.1
        // let skew = 0.1
        let attributes: NSDictionary = [
            NSForegroundColorAttributeName: textColor,
            // NSParagraphStyleAttributeName: paraStyle,
            // NSObliquenessAttributeName: skew,
            NSFontAttributeName: fieldFont!
        ]
        NSString(string: text).drawInRect(rect, withAttributes: attributes as? [String : AnyObject])
        let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    }
    return nil
}
The font size can be anywhere from 16 to 60 px.
Please let me know what the solution could be.
Thanks
Everything seems fine with your code.
One possible problem is that you don't see the image at full size inside your image view, because the view scales it down, so the font looks smaller than you want.
You could resize the image before drawing the text on it, to fit the container it will be displayed in.
Or you can multiply the fontSize by 1/scale, where scale is the scale at which the image will be shown.
For example, if the image is taller than it is wide and the container (the image view) is smaller than the image, the scale will be imageView.frame.size.height/image.size.height.
I think this could resolve your problem.
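The compensation described above can be sketched as a small helper. This assumes an aspect-fit image view; the function name and parameters are illustrative, not from the question:

```swift
import Foundation

/// If the image is displayed scaled down in an aspect-fit image view,
/// divide the desired on-screen font size by the display scale so the
/// drawn text visually matches the text field.
func drawnFontSize(onScreenSize: CGFloat, imageSize: CGSize, viewSize: CGSize) -> CGFloat {
    // Scale at which an aspect-fit image view shows the image.
    let displayScale = min(viewSize.width / imageSize.width,
                           viewSize.height / imageSize.height)
    // Equivalent to multiplying by 1/scale.
    return onScreenSize / displayScale
}
```

For instance, an 18 pt field font drawn on a 1000x1000 image shown in a 500x500 view should be drawn at 36 pt so it appears 18 pt on screen.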
As suggested by LorenzOliveto, this could be due to scaling issues. One easy workaround is to add the text as a subview of the image view and then convert the image view to an image using this extension:
extension UIView {
    /**
     Convert UIView to UIImage
     */
    func toImage() -> UIImage {
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, self.isOpaque, 0.0)
        self.drawHierarchy(in: self.bounds, afterScreenUpdates: false)
        let snapshotImageFromMyView = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return snapshotImageFromMyView!
    }
}

Pasteboard UIImage not using scale

I am building a custom keyboard and am having trouble adding an image to the pasteboard while maintaining the appropriate scale and resolution within the pasted image. Let me start with a screenshot of the keyboard to illustrate my trouble:
So the picture of the face in the top left of the keyboard is just a UIButton with the original photo set as the background. When the button is pressed, the image is resized with the following function:
func imageResize(image: UIImage, size: CGSize) -> UIImage {
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(size, false, scale)
    let context = UIGraphicsGetCurrentContext()
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh)
    image.drawInRect(CGRect(origin: CGPointZero, size: size))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
This function creates a UIImage the same size as the UIButton, with the appropriate scale to reflect the device's screen resolution. To verify that the function is correct, I added a UIImageView filled with the scaled image. The scaled image is the one that looks misplaced near the center of the keyboard. I added the UIImageView with this function:
func addImageToBottomRight(image: UIImage) {
    let tempImgView = UIImageView(image: image)
    self.view.addSubview(tempImgView)
    tempImgView.frame.offset(dx: 100.0, dy: 50.0)
}
I have tried a few different methods for adding the image to the pasteboard, but all of them seem to ignore the scale of the image and display it twice as large, as opposed to displaying it at a higher resolution:
let pb = UIPasteboard.generalPasteboard()!
var pngType = UIPasteboardTypeListImage[0] as! String
var jpegType = UIPasteboardTypeListImage[2] as! String
pb.image = image
pb.setData(UIImagePNGRepresentation(image), forPasteboardType: pngType)
pb.setData(UIImageJPEGRepresentation(image, 1.0), forPasteboardType: jpegType)
All three of these methods do not work correctly and produce the same result as illustrated in the screenshot. Does anyone have any suggestions of other methods? To further clarify my goal, I would like the image in the message text box to look identical to both UIImages in the keyboard in terms of size and resolution.
Here are a few properties of the UIImage before and after resizing, in case anyone is curious:
Starting Image Size is (750.0, 750.0)
Size to scale to is: (78.0, 78.0)
Initial Scale: 1.0
Resized Image Size is (78.0, 78.0)
Resized Image Scale: 2.0
I know this is an old post, but I thought I'd share the workaround I found for this specific case of copying images and pasting them into messaging apps. The thing is, when you send a picture in apps like iMessage, WhatsApp, Messenger, etc., they display the image aspect-fit to a certain horizontal width (let's say around 260 pt for this demo).
As you can see from the diagram below, if you send a 150x150 image at @1x resolution in iMessage, it will be stretched to fill the required 260-wide box, making the image grainy.
But if you add an empty margin 185 wide to both the left and right sides of the image, you end up with an image of size 520x150. Now if you send that image in iMessage, it has to fit it in a 260-wide box, cramming the 520x150 image into a 260x75 box and, in effect, giving you a 75x75 image at @2x resolution.
You can add a clear-color margin to a UIImage with code like this:
extension UIImage {
    func addMargin(withInsets inset: UIEdgeInsets) -> UIImage? {
        let finalSize = CGSize(width: size.width + inset.left + inset.right, height: size.height + inset.top + inset.bottom)
        let finalRect = CGRect(origin: CGPoint(x: 0, y: 0), size: finalSize)
        UIGraphicsBeginImageContextWithOptions(finalSize, false, scale)
        UIColor.clear.setFill()
        UIGraphicsGetCurrentContext()?.fill(finalRect)
        let pictureOrigin = CGPoint(x: inset.left, y: inset.top)
        let pictureRect = CGRect(origin: pictureOrigin, size: size)
        draw(in: pictureRect)
        let finalImage = UIGraphicsGetImageFromCurrentImageContext()
        defer { UIGraphicsEndImageContext() }
        return finalImage
    }
}
