Pasteboard UIImage not using scale - iOS

I am building a custom keyboard and am having trouble adding an image to the pasteboard while maintaining the appropriate scale and resolution within the pasted image. Let me start with a screenshot of the keyboard to illustrate the trouble:
So the picture of the face in the top left of the keyboard is just a UIButton with the original photo set to the background. When the button is pressed the image is resized with the following function:
func imageResize(image: UIImage, size: CGSize) -> UIImage {
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(size, false, scale)
    let context = UIGraphicsGetCurrentContext()
    CGContextSetInterpolationQuality(context, kCGInterpolationHigh)
    image.drawInRect(CGRect(origin: CGPointZero, size: size))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
This function creates a UIImage the same size as the UIButton, with the appropriate scale to reflect the device's screen resolution. To verify that the function is correct, I added a UIImageView filled with the scaled image; it is the image that looks misplaced near the center of the keyboard. I added the UIImageView with this function:
func addImageToBottomRight(image: UIImage) {
    let tempImgView = UIImageView(image: image)
    self.view.addSubview(tempImgView)
    tempImgView.frame.offset(dx: 100.0, dy: 50.0)
}
I have tried a few different methods for adding the image to the pasteboard, but all seem to ignore the scale of the image and display it twice as large as opposed to displaying it at a higher resolution:
let pb = UIPasteboard.generalPasteboard()
let pngType = UIPasteboardTypeListImage[0] as! String
let jpegType = UIPasteboardTypeListImage[2] as! String

pb.image = image
pb.setData(UIImagePNGRepresentation(image), forPasteboardType: pngType)
pb.setData(UIImageJPEGRepresentation(image, 1.0), forPasteboardType: jpegType)
None of these three approaches works correctly; all produce the same result as illustrated in the screenshot. Does anyone have any suggestions of other methods? To further clarify my goal: I would like the image in the message text box to look identical to both UIImages in the keyboard in terms of size and resolution.
Here are a few properties of the UIImage before and after resizing, in case anyone is curious:
Starting image size: (750.0, 750.0)
Size to scale to: (78.0, 78.0)
Initial scale: 1.0
Resized image size: (78.0, 78.0)
Resized image scale: 2.0

I know this is an old post, but I thought I'd share the workaround I found for this specific case of copying images and pasting them into messaging apps. The thing is, when you send a picture with apps like iMessage, WhatsApp, Messenger, etc., they display the image so that it aspect-fits a certain horizontal width (let's say around 260 pt for this demo).
As you can see from the diagram below, if you send a 150x150 image at @1x resolution in iMessage, it will be stretched to fill the required 260-wide box, making the image grainy.
But if you add an empty margin of width 185 to both the left and right sides of the image, you end up with an image of size 520x150. Now if you send that in iMessage, it has to fit it in a 260-wide box, cramming a 520x150 image into a 260x75 box, in effect giving you a 75x75 image at @2x resolution.
You can add a clear-color margin to a UIImage with code like this:
extension UIImage {
    func addMargin(withInsets inset: UIEdgeInsets) -> UIImage? {
        let finalSize = CGSize(width: size.width + inset.left + inset.right,
                               height: size.height + inset.top + inset.bottom)
        let finalRect = CGRect(origin: .zero, size: finalSize)
        UIGraphicsBeginImageContextWithOptions(finalSize, false, scale)
        defer { UIGraphicsEndImageContext() }
        UIColor.clear.setFill()
        UIGraphicsGetCurrentContext()?.fill(finalRect)
        let pictureOrigin = CGPoint(x: inset.left, y: inset.top)
        let pictureRect = CGRect(origin: pictureOrigin, size: size)
        draw(in: pictureRect)
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
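For the numbers in the example above, a usage sketch might look like this (the `image` variable and the ~260 pt bubble width are assumptions taken from the diagram, not fixed values; the real width varies by app):

```swift
// Hypothetical usage: pad a 150x150 @1x picture so the messaging app's
// assumed 260 pt-wide bubble effectively renders it at @2x.
let bubbleWidth: CGFloat = 260            // assumed display width in the messaging app
let displayWidth: CGFloat = 75            // width we want the picture itself to occupy
let densityFactor = image.size.width / displayWidth    // 150 / 75 = 2
let paddedWidth = bubbleWidth * densityFactor          // 520
let sideMargin = (paddedWidth - image.size.width) / 2  // 185
let padded = image.addMargin(withInsets:
    UIEdgeInsets(top: 0, left: sideMargin, bottom: 0, right: sideMargin))
UIPasteboard.general.image = padded
```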

Related

Quality gets reduced when converting an image view to an image

In a photo-editor screen I have an image view with a background image, and on top of the image view I add elements like text (labels), stickers (images), etc. To produce the final image containing all the elements added to the image view, I use the code below.
clipRect is the rect of the background image inside the image view; the image is aspect-fit in the image view.
Below is the code, inside a UIView extension, that generates an image from the view:
// self is the UIView being rendered
let op_format = UIGraphicsImageRendererFormat()
op_format.scale = 1.0
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize), format: op_format)
let originalBound = self.bounds
self.bounds = CGRect(origin: clipRect!.origin, size: clipRect!.size)
let finalImage = renderer.image { ctx in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize), afterScreenUpdates: true)
}
self.bounds = CGRect(origin: originalBound.origin, size: originalBound.size)
The issue is that the quality of the final image is very poor compared to the original background image.
Don't set the scale of your UIGraphicsImageRendererFormat to 1. That forces the output image to @1x (non-Retina). For most (all?) iOS devices, that causes a 2x or 3x loss of resolution. Leave the scale of the UIGraphicsImageRendererFormat at its default value.
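A minimal sketch of that fix, reusing the names from the question's snippet (clipRect and outputSize are assumed to be defined as in the question):

```swift
let format = UIGraphicsImageRendererFormat()
// format.scale is left at its default, which matches the screen (2x or 3x),
// so the rendered image keeps full Retina resolution.
let renderer = UIGraphicsImageRenderer(bounds: CGRect(origin: clipRect!.origin, size: outputSize),
                                       format: format)
let finalImage = renderer.image { _ in
    self.drawHierarchy(in: CGRect(origin: self.bounds.origin, size: outputSize),
                       afterScreenUpdates: true)
}
```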
You can take a screenshot as well:
// Convert a UIView to a UIImage
func captureView() -> UIImage {
    // Gradually increase this number for higher resolution.
    let scale: CGFloat = 1.0
    UIGraphicsBeginImageContextWithOptions(bounds.size, opaque, scale)
    layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let image: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}

How to change the size of a CIImage mask as an input to blendKernel.apply function in Swift

I am trying to create a live hair-color-mask iOS app. I used the UIGraphicsGetImageFromCurrentImageContext() function to merge the mask and the background from video. It works, but the image quality is not good enough. I wanted to change my code to use the blendKernel.apply(foreground: maskCI!, background: backgroundCI!) function. In that case, the size and position of the mask and the background image do not match: the mask image is shown in the bottom-right corner of the background image. My background image size is 1080x1080 and the mask image is 224x224. Could you please advise me how to overlap the mask and the background? Please see my code below:
class IntegrateViewAndMask {
    var blendKernel: CIBlendKernel { return .softLight }

    func mergeMaskAndBackground(mask: UIImage, background: CVPixelBuffer, size: Int) -> UIImage? {
        // Merge two images
        // let sizeImage = CGSize(width: size, height: size)
        // UIGraphicsBeginImageContext(sizeImage)
        // let areaSize = CGRect(x: 0, y: 0, width: sizeImage.width, height: sizeImage.height)

        // Background
        var background = UIImage(pixelBuffer: background)
        // Crop image to a 1080 x 1080 square
        background = cropImageToSquare(image: background!)
        // background?.draw(in: areaSize)

        // Mask
        // mask.draw(in: areaSize)
        let backgroundCI = CIImage(image: background!)
        let maskCI = CIImage(image: mask)
        let trialImage = blendKernel.apply(foreground: maskCI!, background: backgroundCI!)
        let trialImageUI = UIImage(ciImage: trialImage!)
        // let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
        // UIGraphicsEndImageContext()
        return trialImageUI
    }
}
I have just found an answer and tried it; please see the link below. As suggested there, I resized the UIImage to the bounding square and then converted it to a CIImage, and it works perfectly now.
CIImage extent in pixels or points?
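A sketch of that fix, assuming the 224x224 mask and 1080x1080 background from the question: resize the mask UIImage to the background's pixel size before wrapping it in a CIImage, so the two extents line up when blending.

```swift
// Hypothetical helper: redraw a UIImage at an explicit pixel size.
func resized(_ image: UIImage, to size: CGSize) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(size, false, 1)
    defer { UIGraphicsEndImageContext() }
    image.draw(in: CGRect(origin: .zero, size: size))
    return UIGraphicsGetImageFromCurrentImageContext()
}

// Scale the 224x224 mask up to match the 1080x1080 background,
// so blendKernel.apply sees two images with matching extents.
let resizedMask = resized(mask, to: CGSize(width: 1080, height: 1080))
let maskCI = CIImage(image: resizedMask!)
```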

Resizing UIImage to fit table cell ImageView

I have images of the size 176 points by 176 points. I am trying to resize them so that they fit into a tableViewCell's ImageView. There are a few problems that I am coming across.
Problems
I don't know what size the image view in the tableViewCell actually is.
If I simply add the image without resizing it, it is so sharp that it loses detail:
If I use this extension on UIImage (below), then the transparent parts of the UIImage turn to black, not what I want:
extension UIImage {
    func resizeImageWithBounds(bounds: CGSize) -> UIImage {
        let horizontalRatio = bounds.width / size.width
        let verticalRatio = bounds.height / size.height
        let ratio = max(horizontalRatio, verticalRatio)
        let newSize = CGSize(width: size.width * ratio, height: size.height * ratio)
        UIGraphicsBeginImageContextWithOptions(newSize, true, 0)
        draw(in: CGRect(origin: CGPoint.zero, size: newSize))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage!
    }
}
I am looking for information on how big the UIImageView is and how to best resize an image into it. I really don't want to create another set of assets (I have a lot of images), and I don't think I should have to.
Try changing the contentMode of the cell.
cell.imageView?.contentMode = .scaleAspectFit
Or, if that doesn't work, here's how to fix the issue of your resized images turning black:
UIGraphicsBeginImageContextWithOptions(newSize, false, 0) // <-- changed opaque setting from true to false
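Combining both suggestions, configuring the cell might look roughly like this (the 44x44 target size and the `icon` variable are assumptions for illustration; the actual size depends on the cell layout):

```swift
cell.imageView?.contentMode = .scaleAspectFit
// resizeImageWithBounds is the question's helper, with its opaque flag
// changed to false so transparent regions stay transparent.
cell.imageView?.image = icon.resizeImageWithBounds(bounds: CGSize(width: 44, height: 44))
```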

UIImageView AspectFill failing

I'm rendering a 16:9 pixel image programmatically and then using a UIImageView with the kCAFilterNearest filter and .ScaleAspectFill content mode to display it as a background image. The result I get is this:
In the picture you can see that, for whatever reason, it scales the picture exactly right but shifts it up slightly (by about half a scaled pixel) and leaves a line at the bottom, which belongs to the UIImageView the image is displayed in (it's the only view on this UIViewController, and its background color is "navy").
What might be the reason behind this, considering that I use UIScreen.mainScreen().bounds rectangle for the UIImageView?
P.S. The ScaleToFill content mode gives much the same result (the line at the bottom remains).
I use a function to resize the image to fit the screen:
func imageResize(imageObj: UIImage, sizeChange: CGSize) -> UIImage {
    let hasAlpha = false
    let scale: CGFloat = 0.0 // Automatically use the scale factor of the main screen
    UIGraphicsBeginImageContextWithOptions(sizeChange, !hasAlpha, scale)
    imageObj.drawInRect(CGRect(origin: CGPointZero, size: sizeChange))
    let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaledImage
}
and to call it:
let mainScreenSize: CGRect = UIScreen.mainScreen().bounds
let image: UIImage! = self.imageResize(UIImage(named: "IMAGE_NAME")!, sizeChange: CGSizeMake(mainScreenSize.width, mainScreenSize.height))
The initial image that I used had a 16:9 ratio as well.

UIImageView in UITableViewCell low resolution after resizing

I am really frustrated with Auto Layout and UITableViewCell stuff. I have a UITableView with dynamic-height cells. Cells can have an image inside them, and the image should fit the UIImageView's width.
The images are larger than the UIImageView's size, so after downloading an image I resize it and then put it inside the UIImageView. However, resizing the image with respect to the UIImageView's width leads to low resolution, since the UIScreen's scale is greater than 1. So I tried to increase the target size of the image (size = (scale * imageViewWidth, scale * imageViewHeight)).
However, this time the cell's height becomes twice as big. What I'm trying to do is increase the resolution while keeping the UIImageView's height the same.
class ProgressImageView: UIImageView {
    func setImage(imagePath: String?, placeholder: String, showProgress: Bool, showDetail: Bool) {
        if imagePath == nil {
            self.image = UIImage(named: placeholder)
        } else {
            if showProgress {
                self.setShowActivityIndicatorView(true)
                self.setIndicatorStyle(.Gray)
            }
            let placeholderImage = UIImage(named: placeholder)
            SDWebImageManager.sharedManager().downloadImageWithURL(NSURL(string: imagePath!), options: SDWebImageOptions.RefreshCached, progress: nil, completed: { (image, error, _, _, _) in
                // Resize image to fit width
                if error != nil {
                    self.image = placeholderImage
                } else {
                    self.setImageFitWidth(image)
                }
            })
        }
    }
}

extension UIImageView {
    func setImageFitWidth(image: UIImage) {
        let w = self.bounds.size.width //* UIScreen.mainScreen().scale
        print("ImageView size: \(self.bounds.size) scaled width: \(w)")
        self.image = image.convertToWidth(w)
    }
}

extension UIImage {
    func convertToWidth(let width: CGFloat) -> UIImage {
        print("Old image size: \(self.size)")
        let ratio = self.size.width / width
        let height = self.size.height / ratio
        let size = CGSizeMake(width, height)
        print("New image size: \(size)")
        UIGraphicsBeginImageContext(size)
        self.drawInRect(CGRectMake(0, 0, size.width, size.height))
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
As you can see in the setImageFitWidth method, without UIScreen.mainScreen().scale it works just fine except for the resolution, and with the scale the UIImageView doubles in size. By the way, I tried all the contentMode options (aspect fit, aspect fill, etc.) and I am using Auto Layout.
Result in low resolution:
Result in high resolution:
I want my screen to look just like the first screenshot, but with the image resolution of the second.
Try using UIGraphicsBeginImageContextWithOptions(size, true, 0) to draw your image.
A scale of 0 means the function will use the proper scale factor of the device.
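Applied to the question's convertToWidth helper, the change might look like this (a sketch in the question's own Swift 2-era style; only the context-creation line differs from the original):

```swift
func convertToWidth(width: CGFloat) -> UIImage {
    let ratio = self.size.width / width
    let size = CGSizeMake(width, self.size.height / ratio)
    // Scale 0 makes the context use the device's screen scale, so the
    // resulting image keeps the same point size but gains pixel density.
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    self.drawInRect(CGRectMake(0, 0, size.width, size.height))
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}
```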
