Swift - Bad-quality image when loaded in PDF using TPPDF - iOS

While using TPPDF library, when a high-quality image is loaded inside a pdf, the image turns blurry.
I tried all the possible methods.
Here is my Code:
let coverImage = PDFImage(image: UIImage(named: "Banner")!, quality: 1, options: [.resize])
//To find proportional width and height
let imageWidth: CGFloat = coverImage.image.size.width
let imageHeight: CGFloat = coverImage.image.size.height
let targetHeight: CGFloat = imageHeight/imageWidth
print("PdfWidth", pdf.layout.width)
print("PdfHeight", pdf.layout.height)
print("TargetHeight", pdf.layout.width*targetHeight)
print("ImageWidth", coverImage.image.size.width)
print("ImageHeight", coverImage.image.size.height)
coverImage.size = CGSize(width: pdf.layout.width, height: pdf.layout.width*targetHeight)
print("size", coverImage.size)
pdf.addImage(.contentCenter, image: coverImage)
The log I get is:
PdfWidth 595.0
PdfHeight 842.0
TargetHeight 44.5
ImageWidth 595.0
ImageHeight 44.5
size (595.0, 44.5)
I also tried to add a high-resolution image of 8000x600, but still, the image is blurry.
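A minimal sketch of one thing worth checking (an assumption on my part, not a confirmed fix): TPPDF's default `PDFImage` options include resizing and compression, so passing an empty options set and a quality of 1.0 keeps the original bitmap untouched while the size is still controlled explicitly:

```swift
import TPPDF
import UIKit

// Assumption: the blurriness comes from TPPDF's resize/compress pipeline.
// An empty options set skips .resize/.compress; quality 1.0 avoids
// lossy re-compression of the bitmap.
let pdf = PDFDocument(format: .a4)
if let banner = UIImage(named: "Banner") {
    let coverImage = PDFImage(image: banner,
                              quality: 1.0,
                              options: [])
    // Scale to the page width, preserving the aspect ratio.
    let aspect = banner.size.height / banner.size.width
    coverImage.size = CGSize(width: pdf.layout.width,
                             height: pdf.layout.width * aspect)
    pdf.addImage(.contentCenter, image: coverImage)
}
```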

Related

How to convert VNRectangleObservation item to UIImage in SwiftUI

I was able to identify squares from a images using VNDetectRectanglesRequest. Now I want those rectangles to store as separate images (UIImage or cgImage). Below is what I tried.
let rectanglesDetection = VNDetectRectanglesRequest { request, error in
    rectangles = request.results as! [VNRectangleObservation]
    rectangles.sort { $0.boundingBox.origin.y > $1.boundingBox.origin.y }
    for rectangle in rectangles {
        let rect = rectangle.boundingBox
        let imageRef = cgImage.cropping(to: rect)
        let image = UIImage(cgImage: imageRef!, scale: image!.scale, orientation: image!.imageOrientation)
        checkBoxImages.append(image)
    }
}
Can anybody point out what's wrong or what should be the best approach?
Update 1
At this stage, I'm testing with an image that I added to the assets.
With this image I get 7 rectangles as observations: one for each cell and one for the table margin.
My task is to identify the text inside each rectangle, and my approach is to send a VNRecognizeTextRequest for each rectangle that has been identified. My real scenario is a little more complicated than this, but I want to at least achieve this before going forward.
Update 2
for rectangle in rectangles {
    let trueX = rectangle.boundingBox.minX * image!.size.width
    let trueY = rectangle.boundingBox.minY * image!.size.height
    let width = rectangle.boundingBox.width * image!.size.width
    let height = rectangle.boundingBox.height * image!.size.height
    print("x = ", trueX, " y = ", trueY, " width = ", width, " height = ", height)
    let cropZone = CGRect(x: trueX, y: trueY, width: width, height: height)
    guard let cutImageRef: CGImage = image?.cgImage?.cropping(to: cropZone) else {
        return
    }
    let croppedImage: UIImage = UIImage(cgImage: cutImageRef)
    croppedImages.append(croppedImage)
}
My image width and height is
width = 406.0 height = 368.0
I've attached my debug interface so you can get a proper understanding.
As @Lasse mentioned, this is my actual issue, with screenshots.
This is just a guess since you didn't state what the actual problem is, but probably you're getting a zero-sized image for each VNRectangleObservation.
The reason is: Vision uses a normalized coordinate space from 0.0 to 1.0 with lower left origin.
So in order to get the correct rectangle of your original image, you need to convert the rect from normalized space to image space. Luckily there is VNImageRectForNormalizedRect(_:_:_:) to do just that.
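A sketch of the conversion (assuming `cgImage` is the CGImage the request ran on, as in the question): convert each normalized boundingBox to pixel coordinates, then flip the y axis, since Vision uses a lower-left origin while CGImage cropping uses a top-left origin.

```swift
import Vision
import UIKit

// Crop one detected rectangle out of the source CGImage.
func crop(_ observation: VNRectangleObservation, from cgImage: CGImage) -> UIImage? {
    let width = cgImage.width
    let height = cgImage.height
    // Normalized (0...1, lower-left origin) -> image space (pixels).
    let imageRect = VNImageRectForNormalizedRect(observation.boundingBox,
                                                 width, height)
    // CGImage.cropping(to:) expects a top-left origin, so flip y.
    let cropRect = CGRect(x: imageRect.origin.x,
                          y: CGFloat(height) - imageRect.origin.y - imageRect.height,
                          width: imageRect.width,
                          height: imageRect.height)
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped)
}
```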

IOS Swift Kingfisher Resize processor result blurry image

I'm developing an app that uses the Kingfisher library to load images from a URL and display them in a UICollectionViewCell. I'm trying to resize the image to fit the cell's contentView.
I have tried ResizingImageProcessor and .scaleFactor from the library, but the resulting image seems blurry. My implementation is below (the function is called in collectionView(_:cellForItemAt:)):
let url = photo.flickrImageURL("m")
let size = contentView.frame.size
print("size is \(size.debugDescription)")
let resizeProcessor = ResizingImageProcessor(referenceSize: size, mode: .aspectFit)
self.flickrPhoto.kf.setImage(with: url, options: [.backgroundDecode,.processor(resizeProcessor), .scaleFactor(UIScreen.main.scale),.cacheOriginalImage])
Is there anything that I did wrong? The images look very blurry.
You should set referenceSize to the size of the imageView instead of the contentView, as the size of your contentView will be bigger than your imageView's.
Change your code as below:
let size = self.flickrPhoto.frame.size
UPDATED
Just found that ResizingImageProcessor works in pixels instead of points, so you need to multiply the screen scale into the image size, like below:
let resizingProcessor = ResizingImageProcessor(referenceSize: CGSize(width: self.flickrPhoto.frame.size.width * UIScreen.main.scale, height: self.flickrPhoto.frame.size.height * UIScreen.main.scale))
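Putting both fixes together, a sketch of the full call (using the `flickrPhoto` image view and `url` from the question): reference the image view's size converted to pixels, and keep `.scaleFactor` so the result is not treated as a 1x image.

```swift
import Kingfisher
import UIKit

let scale = UIScreen.main.scale
// ResizingImageProcessor works in pixels, so convert points -> pixels.
let pixelSize = CGSize(width: flickrPhoto.frame.size.width * scale,
                       height: flickrPhoto.frame.size.height * scale)
let processor = ResizingImageProcessor(referenceSize: pixelSize, mode: .aspectFit)
flickrPhoto.kf.setImage(
    with: url,
    options: [
        .processor(processor),
        .scaleFactor(scale),      // decode at screen scale, not 1x
        .cacheOriginalImage
    ]
)
```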

Crop UIImage to square portion

I have a UIScrollView which contains a UIImage. On top of that is a box that the user can move the image, so that that portion is cropped.
This screenshot explains it better:
So they can scroll the image around until the portion they want is inside that box.
I then want to be able to crop the scrollView/UIImage to exactly that size and store the cropped image.
It shouldn't be very hard, but I've spent ages trying screenshots, UIGraphicsContext, etc. and can't seem to get anything to work.
Thanks for the help.
I finally figured out how to get it to work. Here is the code:
func croppedImage() -> UIImage {
    let cropSize = CGSize(width: 280, height: 280)
    let scale = (imageView.image?.size.height)! / imageView.frame.height
    let cropSizeScaled = CGSize(width: cropSize.width * scale, height: cropSize.height * scale)
    if #available(iOS 10.0, *) {
        let r = UIGraphicsImageRenderer(size: cropSizeScaled)
        let x = -scrollView.contentOffset.x * scale
        let y = -scrollView.contentOffset.y * scale
        return r.image { _ in
            imageView.image!.draw(at: CGPoint(x: x, y: y))
        }
    } else {
        return UIImage()
    }
}
So it first calculates the scale factor between the actual image and the imageView.
Then it creates a CGSize for the crop box shown in the photo; the width and height must be multiplied by the scale factor (e.g. 280 * 6.5).
You must check that the phone is running iOS 10.0 or later for UIGraphicsImageRenderer - if not, it won't work.
Initialise the renderer with the scaled crop-box size.
The image must then be offset, which is calculated by taking the scrollView's content offset, negating it, and multiplying by the scale factor.
Then return the image drawn at that point!
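For devices below iOS 10, the branch above just returns an empty UIImage. A hypothetical fallback (my own sketch, using the same `imageView`/`scrollView` as the answer) can do the same drawing with the older UIGraphicsBeginImageContextWithOptions API:

```swift
import UIKit

// Pre-iOS 10 fallback: same crop math, older context API.
func croppedImageLegacy(imageView: UIImageView, scrollView: UIScrollView) -> UIImage {
    let cropSize = CGSize(width: 280, height: 280)
    let scale = (imageView.image?.size.height ?? 1) / imageView.frame.height
    let cropSizeScaled = CGSize(width: cropSize.width * scale,
                                height: cropSize.height * scale)
    // Opaque = false, scale 0 = use the device's screen scale.
    UIGraphicsBeginImageContextWithOptions(cropSizeScaled, false, 0)
    defer { UIGraphicsEndImageContext() }
    let x = -scrollView.contentOffset.x * scale
    let y = -scrollView.contentOffset.y * scale
    imageView.image?.draw(at: CGPoint(x: x, y: y))
    return UIGraphicsGetImageFromCurrentImageContext() ?? UIImage()
}
```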

Getting size of an image in an UIImageView

I am having trouble getting the size of an image after it has been assigned to a UIImageView programmatically.
The code below runs in a loop, and a function (getNoteImage) is used to download an image from a URL (photoUrl) and assign it to the created UIImageView. I need the height of the image so that I can calculate the spacing for the following images and labels.
var myImage :UIImageView
myImage = UIImageView(frame: CGRectMake(0, 0, UIScreen.mainScreen().bounds.width, UIScreen.mainScreen().bounds.height))
myImage.center = CGPointMake(UIScreen.mainScreen().bounds.size.width/2, imageY)
myImage.getNoteImage(photoUrl)
self.detailsView.addSubview(myImage)
myImage.contentMode = UIViewContentMode.ScaleAspectFit
imageHeight = myImage.bounds.size.height
I have tried using
imageHeight = myImage.bounds.size.height
which I read as a solution on another post, but this just returns the screen size for the device (667 on the iPhone 6 simulator).
Can anyone guide me on how I can get the correct image size ?
Thanks
As your imageView takes up the whole screen and the image is sized using aspect fit, to get the height the image is displayed at you will need to get the original image size using myImage.image!.size and then scale it based on myImage.bounds.size, something like:
let imageViewHeight = myImage.bounds.height
let imageViewWidth = myImage.bounds.width
let imageSize = myImage.image!.size
let scaledImageHeight = min(imageSize.height * (imageViewWidth / imageSize.width), imageViewHeight)
That should give the actual height of the image. Note that image.size gives the "logical dimensions of the image", i.e. its natural size, not the size it is drawn at.
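As an alternative to doing the aspect-fit math by hand (my suggestion, not from the question), AVFoundation ships a helper that computes the drawn rect directly:

```swift
import AVFoundation
import UIKit

// AVMakeRect(aspectRatio:insideRect:) returns the largest rect with the
// image's aspect ratio that fits inside the image view's bounds - i.e.
// the rect the image actually occupies under .scaleAspectFit.
if let image = myImage.image {
    let fitted = AVMakeRect(aspectRatio: image.size,
                            insideRect: myImage.bounds)
    let imageHeight = fitted.height   // displayed height in points
    print("displayed image height is \(imageHeight)")
}
```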
As the official UIImage documentation describes:
if let image = myImage.image {
    let size = image.size
    print("image size is \(size)")
} else {
    print("There is no image here..")
}
I assume your code loads the image synchronously, as I understand from your question; if not, I recommend using AlamofireImage.
With AlamofireImage you can do:
self.myImage.af_setImageWithURL(
    NSURL(string: photoUrl)!,
    placeholderImage: nil,
    filter: nil,
    imageTransition: .CrossDissolve(0.5),
    completion: { response in
        print(response.result.value) // UIImage
        if let image = response.result.value {
            let size = image.size
            print("image size is \(size)")
        }
        print(response.result.error) // NSError
    }
)
Change your code like this and try again. The issue is that you force-set the frame of the UIImageView before giving it an image; if you don't, you can use imageHeight = myImage.bounds.size.height to get the real size of the image.
var myImage :UIImageView
//remove your initial frame parameters
myImage = UIImageView()
myImage.center = CGPointMake(UIScreen.mainScreen().bounds.size.width/2, imageY)
Try using myImage.image?.size.height, and also make sure the image has actually been downloaded from the URL: put the size check in a completion block so it only runs once the download has finished.
See the documentation for more details.

iOS: Swift: How to get proper image quality with CGImageCreateWithImageInRect?

I am trying to build simple crop functionality with Swift, using the CGImageCreateWithImageInRect function - it works, but produces inferior quality. Am I missing something?
func retriveCroppedImage() {
    let yratio: CGFloat = imgviewrect.size.height / chosenImage.size.height
    let xratio: CGFloat = imgviewrect.size.width / chosenImage.size.width
    var cliprect = CGRectMake(centerpoint.x - vWidth/2, centerpoint.y - vHeight/2, vWidth, vHeight)
    print("cliprect top \(cliprect.size)")
    cliprect.size.height = cliprect.size.height / xratio
    cliprect.size.width = cliprect.size.width / xratio
    cliprect.origin.x = cliprect.origin.x / xratio + imgviewrect.origin.x / xratio
    cliprect.origin.y = cliprect.origin.y / yratio - imgviewrect.origin.y / xratio
    print("cliprect On Image \(cliprect)")
    let imageRef = CGImageCreateWithImageInRect(chosenImage.CGImage, cliprect)
    croppedImg = UIImage(CGImage: imageRef!, scale: UIScreen.mainScreen().scale, orientation: chosenImage.imageOrientation)
    print("Operation complete")
}
Screenshots: the main VC, and the cropped image I get after cropping.
After trying all the options, I accidentally found that I had set an alpha on the image view in the storyboard. There was nothing wrong with the CGImageCreateWithImageInRect function. Now my cropping app works as desired. Thank you all for the suggestions.
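For reference, a sketch of the same crop in later Swift versions (my own example, not the asker's code) uses cgImage.cropping(to:). Like its predecessor, it operates in pixels, so rects expressed in points must be multiplied by the image's scale first:

```swift
import UIKit

// Crop a UIImage with a rect given in points in the image's own
// coordinate space; cropping(to:) works in pixels.
func crop(_ image: UIImage, toPointRect rect: CGRect) -> UIImage? {
    let pixelRect = CGRect(x: rect.origin.x * image.scale,
                           y: rect.origin.y * image.scale,
                           width: rect.width * image.scale,
                           height: rect.height * image.scale)
    guard let cgImage = image.cgImage?.cropping(to: pixelRect) else { return nil }
    // Preserve the original scale and orientation in the result.
    return UIImage(cgImage: cgImage,
                   scale: image.scale,
                   orientation: image.imageOrientation)
}
```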
